Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2023/03/18 20:01:38 UTC

Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #937

See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/937/display/redirect>

Changes:


------------------------------------------
[...truncated 26.65 KB...]
Collecting azure-storage-blob>=12.3.2
  Using cached azure_storage_blob-12.15.0-py3-none-any.whl (387 kB)
Collecting azure-core>=1.7.0
  Using cached azure_core-1.26.3-py3-none-any.whl (174 kB)
Collecting azure-identity>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Collecting boto3>=1.9
  Using cached boto3-1.26.94-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.94
  Using cached botocore-1.29.94-py3-none-any.whl (10.5 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.3-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3041044 sha256=e95501df83d28728c713b746f51d59517144dd97207a2688a9115e8f87722b2e
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.94 botocore-1.29.94 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.2 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.7.0 google-cloud-bigquery-storage-2.19.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.0 google-cloud-dlp-3.12.0 google-cloud-language-2.9.0 google-cloud-pubsub-2.15.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.2 google-cloud-spanner-3.28.0 google-cloud-videointelligence-2.11.0 google-cloud-vision-3.4.0 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.51.3 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.7 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
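
For reference, the pre-building workflow that message points to is enabled through pipeline options. A minimal sketch, assuming the --prebuild_sdk_container_engine and --docker_registry_push_url options available in recent Beam Python SDKs (the project, bucket, and registry names below are hypothetical):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                 # hypothetical project id
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',   # hypothetical bucket
        '--requirements_file=requirements.txt',
        # Build the SDK container image once up front instead of installing
        # the extra dependencies on every worker at startup:
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=us-central1-docker.pkg.dev/my-project/my-repo',
    ])

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
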
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0318150157.1679152068.885489/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0318150157.1679152068.885489/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0318150157.1679152068.885489/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0318150157.1679152068.885489/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230318150748886505-1690'
 createTime: '2023-03-18T15:07:49.968705Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-18_08_07_49-9450084129099571119'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0318150157'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-18T15:07:49.968705Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-18_08_07_49-9450084129099571119]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-18_08_07_49-9450084129099571119
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-18_08_07_49-9450084129099571119?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-18_08_07_49-9450084129099571119 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:07:56.922Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:02.774Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.272Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.335Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.409Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.436Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.504Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.569Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.610Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.647Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.668Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.690Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.715Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.748Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.783Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.817Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.852Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.885Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.908Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.931Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:03.967Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:04.073Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:04.100Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:04.132Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:04.167Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:04.199Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-18_08_07_49-9450084129099571119 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:05.270Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:05.301Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:05.346Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:07.873Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
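
A minimal sketch of pruning old custom metric descriptors as that message suggests, assuming the google-cloud-monitoring client library is installed (the project id below is hypothetical; deletion is left commented out):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/my-project"  # hypothetical project id

    # List the Dataflow-created custom metric descriptors counted against the limit.
    descriptors = client.list_metric_descriptors(request={
        "name": project_name,
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        print(descriptor.type)
        # Uncomment to delete a descriptor that is no longer used:
        # client.delete_metric_descriptor(name=descriptor.name)
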
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:08:43.152Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:09:13.902Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:09:24.542Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:51:09.930Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:52:08.579Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T15:52:59.513Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T16:22:01.256Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T16:23:13.437Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T16:44:15.314Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T16:54:07.916Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T16:55:09.341Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T17:17:10.321Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T17:31:11.952Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T17:50:13.656Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T17:51:15.264Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T18:01:27.418Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T18:05:18.390Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T18:23:19.262Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T18:32:21.481Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T18:36:22.376Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T18:43:33.367Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T18:55:34.394Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T19:06:26.827Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T19:13:27.797Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T19:19:28.858Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T19:30:30.669Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T19:41:31.264Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T19:47:42.550Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T19:55:34.380Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-18_08_07_49-9450084129099571119 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T20:00:45.329Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-18_08_07_49-9450084129099571119.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T20:00:45.361Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T20:00:45.424Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T20:00:45.455Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T20:00:45.493Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-18T20:00:45.518Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-18_08_07_49-9450084129099571119?project=<ProjectId>
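
The harness fails here because wait_until_finish(duration=self.timeout_ms) returned without the job reaching a terminal state. A minimal sketch of guarding that call and cancelling a stuck streaming job yourself, assuming a Dataflow PipelineResult (the trivial pipeline and the timeout value below are placeholders):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    # Dataflow options omitted; a real run needs runner/project/region settings.
    p = beam.Pipeline(options=PipelineOptions())
    _ = p | beam.Create([1, 2, 3]) | beam.Map(print)

    result = p.run()
    # duration is in milliseconds; the call returns the last observed job state.
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)
    if not PipelineState.is_terminal(state):
        # Still not terminal after the timeout: cancel rather than wait forever.
        result.cancel()
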

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 9s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/besouzxft2qne

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #960

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/960/display/redirect?page=changes>

Changes:

[noreply] Disable code coverage on IO_PreCommit tests (#26199)


------------------------------------------
[...truncated 24.61 KB...]
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting boto3>=1.9
  Using cached boto3-1.26.109-py3-none-any.whl (135 kB)
Collecting freezegun>=0.3.12
  Using cached freezegun-1.2.2-py3-none-any.whl (17 kB)
Collecting joblib>=1.0.1
  Using cached joblib-1.2.0-py3-none-any.whl (297 kB)
Collecting mock<6.0.0,>=1.0.1
  Using cached mock-5.0.1-py3-none-any.whl (30 kB)
Collecting pandas<2.0.0
  Using cached pandas-1.3.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.3 MB)
Collecting parameterized<0.9.0,>=0.7.1
  Using cached parameterized-0.8.1-py2.py3-none-any.whl (26 kB)
Collecting pyhamcrest!=1.10.0,<2.0.0,>=1.9
  Using cached PyHamcrest-1.10.1-py3-none-any.whl (48 kB)
Collecting pyyaml<7.0.0,>=3.12
  Using cached PyYAML-6.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (596 kB)
Collecting requests_mock<2.0,>=1.7
  Using cached requests_mock-1.10.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2
  Using cached pytest-7.3.0-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0
  Using cached pytest_xdist-3.2.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3
  Using cached SQLAlchemy-1.4.47-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0
  Using cached cryptography-40.0.1-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.71.0-py3-none-any.whl (406 kB)
Collecting azure-storage-blob>=12.3.2
  Using cached azure_storage_blob-12.15.0-py3-none-any.whl (387 kB)
Collecting azure-core>=1.7.0
  Using cached azure_core-1.26.4-py3-none-any.whl (173 kB)
Collecting azure-identity>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from azure-core>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.109
  Using cached botocore-1.29.109-py3-none-any.whl (10.6 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.0)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: importlib-metadata>=0.12 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Requirement already satisfied: pluggy<2.0,>=0.12 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3055124 sha256=dcf19331122c2e2b2ebff4b5cad84881fa7c807a979e2c1ee7a323c3b95f01ea
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-22.2.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.109 botocore-1.29.109 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.2 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.30.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.71.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.0 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "/home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0410125353.1681139268.674692/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0410125353.1681139268.674692/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0410125353.1681139268.674692/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0410125353.1681139268.674692/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230410150748675681-9749'
 createTime: '2023-04-10T15:07:49.699766Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-10_08_07_49-7882248721090782608'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0410125353'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-10T15:07:49.699766Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-10_08_07_49-7882248721090782608]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-10_08_07_49-7882248721090782608
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-10_08_07_49-7882248721090782608?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-10_08_07_49-7882248721090782608 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:07:54.448Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:00.994Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:01.706Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:01.769Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:01.850Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:01.878Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:01.944Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:01.985Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.016Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.043Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.079Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.102Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.125Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.158Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.179Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.200Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.234Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.267Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.299Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.333Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.355Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.447Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.484Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.518Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.539Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.561Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.726Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.757Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:02.787Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-10_08_07_49-7882248721090782608 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:11.638Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:40.391Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:08:40.411Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:09:07.889Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:09:10.118Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:09:13.246Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:09:15.613Z: JOB_MESSAGE_DETAILED: All ****s have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:41:54.325Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:42:54.977Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T15:43:55.953Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-10T16:06:56.831Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-6' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:215)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
	at com.sun.proxy.$Proxy133.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1215)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1207)
	at hudson.Launcher$ProcStarter.join(Launcher.java:524)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
	at hudson.model.Build$BuildExecution.build(Build.java:199)
	at hudson.model.Build$BuildExecution.doRun(Build.java:164)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
	at hudson.model.Run.execute(Run.java:1896)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
	at hudson.model.ResourceController.execute(ResourceController.java:101)
	at hudson.model.Executor.run(Executor.java:442)
Caused by: java.io.IOException: Unexpected termination of the channel
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:75)
Caused by: java.io.EOFException
	at java.base/java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2911)
	at java.base/java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:3406)
	at java.base/java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:932)
	at java.base/java.io.ObjectInputStream.<init>(ObjectInputStream.java:375)
	at hudson.remoting.ObjectInputStreamEx.<init>(ObjectInputStreamEx.java:49)
	at hudson.remoting.Command.readFrom(Command.java:142)
	at hudson.remoting.Command.readFrom(Command.java:128)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:35)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-6 is offline; cannot locate jdk_1.8_latest



Jenkins build is back to normal : beam_LoadTests_Python_Combine_Dataflow_Streaming #1026

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1026/display/redirect>




Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1025

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1025/display/redirect>

Changes:


------------------------------------------
[...truncated 29.08 KB...]
Collecting requests_mock<2.0,>=1.7 (from apache-beam==2.49.0.dev0)
  Using cached requests_mock-1.11.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.2-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.78.2-py3-none-any.whl (416 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.153 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.153-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (6.6.0)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.3-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.1-py2.py3-none-any.whl (224 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3201909 sha256=c34d32a9124e89e10e4fdd8731f10e8a9581aea89f93bef215782fa319c1c7cb
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, threadpoolctl, tenacity, sqlparse, six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pymongo, pydot, pyasn1-modules, isodate, httplib2, grpcio-status, google-resumable-media, cffi, attrs, requests_mock, pytest, pandas, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.1 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.153 botocore-1.29.153 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.20.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.1 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.1 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.78.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.2 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 urllib3-1.26.16 websocket-client-1.5.3 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0614125356.1686755268.165640/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0614125356.1686755268.165640/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0614125356.1686755268.165640/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0614125356.1686755268.165640/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230614150748166753-8349'
 createTime: '2023-06-14T15:07:49.185557Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-14_08_07_48-12832743795784628884'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0614125356'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-14T15:07:49.185557Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-14_08_07_48-12832743795784628884]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-14_08_07_48-12832743795784628884
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-14_08_07_48-12832743795784628884?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-14_08_07_48-12832743795784628884 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:57.410Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.497Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.533Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.601Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.659Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.700Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.747Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.797Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.829Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.868Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.888Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.916Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.942Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.975Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.006Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.040Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.074Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.105Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.137Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.168Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.203Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.284Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.314Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.354Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.424Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.465Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.665Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.702Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.725Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-14_08_07_48-12832743795784628884 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:08:23.802Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:08:40.125Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:08:40.159Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:08:49.911Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:09:12.601Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:09:23.430Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:26:56.223Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:57:23.925Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T16:21:34.995Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T16:46:30.026Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T17:05:27.475Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T17:30:28.660Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T17:56:33.219Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T18:20:35.787Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T18:45:33.348Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T19:10:35.134Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T19:35:46.268Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-14_08_07_48-12832743795784628884 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.620Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-14_08_07_48-12832743795784628884.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.685Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.738Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.762Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.898Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.913Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-14_08_07_48-12832743795784628884?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED
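
The traceback above shows the harness calling wait_until_finish with a duration and asserting on the returned state. A minimal sketch of that wait-then-cancel pattern, assuming a hypothetical timeout_ms and a placeholder transform rather than the load test's real synthetic source:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.runners.runner import PipelineState

timeout_ms = 4 * 60 * 60 * 1000  # hypothetical budget, not the job's actual setting

options = PipelineOptions(streaming=True)  # runner/project flags omitted here
p = beam.Pipeline(options=options)
_ = p | "Source" >> beam.Impulse()  # placeholder for the synthetic source

result = p.run()
# wait_until_finish takes milliseconds; for a streaming job it can return
# while the job is still RUNNING, which is what the assertion above trips on.
state = result.wait_until_finish(duration=timeout_ms)
if not PipelineState.is_terminal(state):
    result.cancel()  # produces the JOB_STATE_CANCELLING transition seen in the log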

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 10s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/davxaab3q32nc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
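
The runner hint earlier in this log ("consider using the SDK container image pre-building workflow") maps to a pair of pipeline options. A hedged sketch, assuming a placeholder registry URL and a Beam release recent enough to carry these flags:

from apache_beam.options.pipeline_options import PipelineOptions

# Prebuild the SDK worker container once, instead of reinstalling the extra
# dependencies on every worker at startup. The push URL is a placeholder.
options = PipelineOptions([
    "--runner=DataflowRunner",
    "--prebuild_sdk_container_engine=cloud_build",
    "--docker_registry_push_url=gcr.io/my-project/prebuilt",  # hypothetical
    "--requirements_file=requirements.txt",
])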


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1024

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1024/display/redirect>

Changes:


------------------------------------------
[...truncated 29.12 KB...]
  Using cached google_cloud_pubsublite-1.7.0-py2.py3-none-any.whl (273 kB)
Collecting google-cloud-bigquery<4,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery-3.11.0-py2.py3-none-any.whl (219 kB)
Collecting google-cloud-bigquery-storage<3,>=2.6.3 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery_storage-2.20.0-py2.py3-none-any.whl (190 kB)
Collecting google-cloud-core<3,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<2.18.0,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_spanner-3.36.0-py2.py3-none-any.whl (332 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.10.0-py2.py3-none-any.whl (101 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.2-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.2-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.152 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.152-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (6.6.0)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.3-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.1-py2.py3-none-any.whl (224 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3201909 sha256=6e28f4a228c3cc30c003126a4bc58ad1c7641e19d399a067a9663ad72ef36af9
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, threadpoolctl, tenacity, sqlparse, six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pymongo, pydot, pyasn1-modules, isodate, httplib2, grpcio-status, google-resumable-media, cffi, attrs, requests_mock, pytest, pandas, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.152 botocore-1.29.152 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.1 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.78.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.2 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 urllib3-1.26.16 websocket-client-1.5.3 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0613125350.1686668861.047711/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0613125350.1686668861.047711/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0613125350.1686668861.047711/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0613125350.1686668861.047711/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230613150741048835-9768'
 createTime: '2023-06-13T15:07:43.892764Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-13_08_07_42-16500879349966031697'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0613125350'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-13T15:07:43.892764Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-13_08_07_42-16500879349966031697]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-13_08_07_42-16500879349966031697
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-13_08_07_42-16500879349966031697?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-13_08_07_42-16500879349966031697 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:01.060Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:07.660Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.306Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.379Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.459Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.491Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.549Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.605Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.650Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.690Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.725Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.759Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.786Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.823Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.853Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.886Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.920Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.998Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.030Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.061Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.098Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.210Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.245Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.277Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.316Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.355Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.547Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.591Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.655Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:12.125Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-13_08_07_42-16500879349966031697 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:56.744Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:56.776Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:09:06.628Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:09:28.787Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:09:39.335Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:22:54.539Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:54:11.391Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T16:20:12.155Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T16:43:13.674Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T17:09:25.111Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T17:35:16.484Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T17:55:28.793Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T18:14:20.272Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T18:39:22.370Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T18:58:23.316Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T19:23:24.903Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T19:49:26.483Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-13_08_07_42-16500879349966031697 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.758Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-13_08_07_42-16500879349966031697.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.807Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.868Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.889Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.910Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.933Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-13_08_07_42-16500879349966031697?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED
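
The fusion messages above spell out this test's shape: a synthetic source, a "Measure time: Start" step, a global Top combine, a "Consume" step, and "Measure time: End". A stripped-down sketch of that shape, with beam.Create standing in for the synthetic source and plain Maps standing in for the load test's metric hooks:

import apache_beam as beam
from apache_beam.transforms.combiners import TopCombineFn

with beam.Pipeline() as p:
    _ = (
        p
        | "Read synthetic" >> beam.Create(range(1000))  # stand-in source
        | "Measure time: Start" >> beam.Map(lambda x: x)
        # "Combine with Top 0" in the log is a global top-N combine like this:
        | "Combine with Top" >> beam.CombineGlobally(TopCombineFn(10))
        | "Consume" >> beam.Map(len)
        | "Measure time: End" >> beam.Map(lambda n: n)
    )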

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 15s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/x2hzbqgo5frya

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1023

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1023/display/redirect?page=changes>

Changes:

[noreply] Adding error tags in BigQuery Write Transforms (#27020)


------------------------------------------
[...truncated 29.19 KB...]
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.78.1-py3-none-any.whl (416 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.49.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_core-1.27.0-py3-none-any.whl (174 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.151 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.151-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (6.6.0)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.3-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3201909 sha256=e90e236b6deb960228fd47beed149069ff90a2c3d585d9543a7208eddba03138
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, threadpoolctl, tenacity, sqlparse, six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pymongo, pydot, pyasn1-modules, isodate, httplib2, grpcio-status, google-resumable-media, cffi, attrs, requests_mock, pytest, pandas, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.151 botocore-1.29.151 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.78.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.2 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 urllib3-1.26.16 websocket-client-1.5.3 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
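
The pre-building workflow suggested here bakes the extra dependencies into a worker image once, instead of reinstalling them on every worker at startup. A rough sketch of the relevant pipeline options, assuming the prebuild flags present in recent Beam Python SDKs; the requirements file and registry path are placeholders:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=DataflowRunner",
        "--requirements_file=requirements.txt",         # the extra dependencies
        "--prebuild_sdk_container_engine=cloud_build",  # or local_docker
        # Placeholder registry; push access to it is assumed.
        "--docker_registry_push_url=gcr.io/<ProjectId>/prebuilt-beam-sdk",
    ])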
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0612125353.1686582475.376874/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0612125353.1686582475.376874/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0612125353.1686582475.376874/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0612125353.1686582475.376874/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230612150755377841-2334'
 createTime: '2023-06-12T15:07:56.501013Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-12_08_07_56-5142297964075332076'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0612125353'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-12T15:07:56.501013Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-12_08_07_56-5142297964075332076]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-12_08_07_56-5142297964075332076
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-12_08_07_56-5142297964075332076?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-12_08_07_56-5142297964075332076 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:07:59.754Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.763Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.785Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.848Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.908Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.936Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.988Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.054Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.092Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.118Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.147Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.181Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.219Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.251Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.273Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.302Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.326Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.347Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.372Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.395Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.416Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.500Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.529Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.553Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-12_08_07_56-5142297964075332076 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.598Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.636Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.806Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.838Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.893Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:33.913Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
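
The cap mentioned here is Cloud Monitoring's per-project limit on custom metric descriptors. One way to audit which custom.googleapis.com/* descriptors exist (and could be deleted) is the Monitoring API; a sketch using the google-cloud-monitoring client, with the project id as a placeholder:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    # List only the custom descriptors that count against the cap.
    for descriptor in client.list_metric_descriptors(request={
            "name": "projects/<ProjectId>",  # placeholder project
            "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }):
        print(descriptor.type)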
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:09:41.127Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:09:41.154Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:09:56.283Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:10:03.660Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:10:58.581Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:10:58.609Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:11:18.860Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:34:18.861Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:39:29.428Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T16:03:27.090Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T16:28:32.651Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T16:53:39.826Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T17:19:40.941Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T17:39:32.132Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T18:04:33.286Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T18:29:34.185Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T18:55:35.318Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T19:21:36.542Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-12_08_07_56-5142297964075332076 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:07.935Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-12_08_07_56-5142297964075332076.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:07.976Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:08.024Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:08.045Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:08.073Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:08.097Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-12_08_07_56-5142297964075332076?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 8s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/gsxvqjtcjp5vq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1022

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1022/display/redirect?page=changes>

Changes:

[noreply] Add required commands to allowlist_externals in tox.ini (#27089)


------------------------------------------
[...truncated 129.48 KB...]
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/sdk_****.py", line 295, in _execute
    response = task()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/sdk_****.py", line 370, in <lambda>
    lambda: self.create_****().do_instruction(request), request)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/sdk_****.py", line 630, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/sdk_****.py", line 661, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/sdk_****.py", line 496, in get
    self.data_sampler)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/bundle_processor.py", line 877, in __init__
    _verify_descriptor_created_in_a_compatible_env(process_bundle_descriptor)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/bundle_processor.py", line 841, in _verify_descriptor_created_in_a_compatible_env
    "Pipeline construction environment and pipeline runtime "
RuntimeError: Pipeline construction environment and pipeline runtime environment are not compatible. If you use a custom container image, check that the Python interpreter minor version and the Apache Beam version in your image match the versions used at pipeline construction time. Submission environment: beam:version:sdk_base:apache/beam_python3.7_sdk:2.49.0.dev. Runtime environment: beam:version:sdk_base:apache/beam_python3.7_sdk:2.48.0.dev.
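
This RuntimeError is Beam's guard against mixed SDK versions: the pipeline was constructed and submitted with 2.49.0.dev, but the workers ran the 2.48.0.dev container. The fix is to keep both sides on the same Beam version and Python minor version; a sketch of pinning the worker image explicitly, where the image tag is illustrative and must match whatever version actually submits the job:

    import apache_beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Submission side: this is the version recorded in the pipeline proto.
    print("submitting with Beam", apache_beam.__version__)

    options = PipelineOptions([
        "--runner=DataflowRunner",
        # Runtime side: pin the worker container to the submitting SDK's
        # version and Python minor version (tag shown is illustrative).
        "--sdk_container_image=apache/beam_python3.7_sdk:2.49.0",
    ])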

[...11 identical JOB_MESSAGE_ERROR entries elided: each repeats the RuntimeError traceback above, logged at roughly 30-second intervals from 2023-06-11T19:27:58.057Z through 2023-06-11T19:33:00.284Z...]

INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-11T19:39:10.618Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-11_08_07_44-17811166637571296847 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-11_08_07_44-17811166637571296847?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 54m 18s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/h5cuf2bvfk3dc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1021

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1021/display/redirect>

Changes:


------------------------------------------
[...truncated 28.98 KB...]
Collecting google-cloud-core<3,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<2.18.0,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_spanner-3.36.0-py2.py3-none-any.whl (332 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.10.0-py2.py3-none-any.whl (101 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.2-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.2-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.49.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_core-1.27.0-py3-none-any.whl (174 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Collecting boto3<2,>=1.9 (from apache-beam==2.49.0.dev0)
  Using cached boto3-1.26.151-py3-none-any.whl (135 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.151 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.151-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (6.6.0)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.3-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3201909 sha256=544ebcd4c7bdff7a23ef45b4fc4ea8c0cdd095c0d9fd9f92ca9dd99ad93e44ae
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, threadpoolctl, tenacity, sqlparse, six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pymongo, pydot, pyasn1-modules, isodate, httplib2, grpcio-status, google-resumable-media, cffi, attrs, requests_mock, pytest, pandas, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.151 botocore-1.29.151 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.77.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 urllib3-1.26.16 websocket-client-1.5.3 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
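
As the message above notes, the per-job dependency installation can be avoided by pre-building the SDK worker image. A hedged sketch of opting in via pipeline options, following the linked Dataflow guide (the engine value and registry URL below are illustrative placeholders, not this job's configuration):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--prebuild_sdk_container_engine=cloud_build',  # build the image once with Cloud Build
        '--docker_registry_push_url=gcr.io/<ProjectId>/beam-prebuilt',  # placeholder registry
    ])
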
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0610125353.1686409650.670823/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0610125353.1686409650.670823/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0610125353.1686409650.670823/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0610125353.1686409650.670823/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230610150730672162-7951'
 createTime: '2023-06-10T15:07:31.897805Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-10_08_07_31-1241553607541181454'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0610125353'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-10T15:07:31.897805Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-10_08_07_31-1241553607541181454]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-10_08_07_31-1241553607541181454
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-10_08_07_31-1241553607541181454?project=apache-beam-testing
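
Besides the monitoring console, the same job can be inspected from a terminal, e.g. (assuming an installed and authenticated gcloud CLI):

    gcloud dataflow jobs describe 2023-06-10_08_07_31-1241553607541181454 --region=us-central1 --project=apache-beam-testing
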
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-10_08_07_31-1241553607541181454 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:35.323Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:41.715Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:41.832Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:41.896Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:41.965Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:41.993Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.057Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.102Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.130Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.159Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.176Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.197Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.228Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.261Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.293Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.325Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.357Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.390Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.421Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.454Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.488Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.578Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.604Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.636Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.666Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.698Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.879Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.915Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.955Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:46.284Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-10_08_07_31-1241553607541181454 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:08:22.508Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:08:54.941Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:09:05.175Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:11:07.390Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:41:35.317Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T16:06:37.822Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T16:31:48.840Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T16:56:38.749Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T17:21:39.796Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T18:04:41.206Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T18:24:52.234Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T18:50:43.297Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T19:15:44.433Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T19:39:45.299Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-10_08_07_31-1241553607541181454 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.800Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-10_08_07_31-1241553607541181454.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.859Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.921Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.938Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.969Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.992Z: JOB_MESSAGE_BASIC: Stopping worker pool...
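
The JOB_STATE_CANCELLING transition below is the job reacting to an external cancel request issued after the harness timed out. The same request can be issued directly from the result handle, so a timed-out streaming load test does not keep running; a minimal sketch (variable names hypothetical):

    from apache_beam.runners.runner import PipelineState

    state = result.wait_until_finish(duration=timeout_ms)
    if not PipelineState.is_terminal(state):
        result.cancel()  # asks Dataflow to cancel; the job then reports JOB_STATE_CANCELLING
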
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-10_08_07_31-1241553607541181454?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 29s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/qedwmwadmwdw2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1020

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1020/display/redirect>

Changes:


------------------------------------------
[...truncated 29.12 KB...]
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.77.0-py3-none-any.whl (416 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.49.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_core-1.27.0-py3-none-any.whl (174 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.150 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.150-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (6.6.0)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.3-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3201884 sha256=a3ad9c824bc5c6dd87b26d1eaa62ab44e50528f5c8ef834aa96a3fdc1e8a99ba
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, threadpoolctl, tenacity, sqlparse, six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pymongo, pydot, pyasn1-modules, isodate, httplib2, grpcio-status, google-resumable-media, cffi, attrs, requests_mock, pytest, pandas, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.150 botocore-1.29.150 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.77.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 urllib3-1.26.16 websocket-client-1.5.3 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0609130237.1686323325.906078/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0609130237.1686323325.906078/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0609130237.1686323325.906078/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0609130237.1686323325.906078/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230609150845907209-9873'
 createTime: '2023-06-09T15:08:47.315547Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-09_08_08_46-1785475662426921351'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0609130237'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-09T15:08:47.315547Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-09_08_08_46-1785475662426921351]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-09_08_08_46-1785475662426921351
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-09_08_08_46-1785475662426921351?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-09_08_08_46-1785475662426921351 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:08:57.725Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:03.994Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:09.002Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.109Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.169Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.189Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.247Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.303Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.343Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.374Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.400Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.429Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.460Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.492Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.523Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.539Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.568Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.599Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.630Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.662Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.695Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.797Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.823Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.851Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.880Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.898Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:13.058Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:13.086Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:13.135Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:15.499Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-09_08_08_46-1785475662426921351 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:56.693Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:10:26.149Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:10:33.699Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:15:31.245Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:39:36.605Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T16:05:37.926Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T16:25:41.272Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T16:51:41.030Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T17:17:52.051Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T17:36:42.941Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T17:55:53.551Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T18:20:44.690Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T18:53:45.867Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T19:18:47.475Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T19:43:48.566Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:08:59.901Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-09_08_08_46-1785475662426921351 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.119Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-09_08_08_46-1785475662426921351.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.158Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.202Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.229Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.249Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.274Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-09_08_08_46-1785475662426921351?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5h 26m 9s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/5qmptm54mhxua

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1019

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1019/display/redirect>

Changes:


------------------------------------------
[...truncated 32.34 KB...]
Collecting requests_mock<2.0,>=1.7 (from apache-beam==2.49.0.dev0)
  Using cached requests_mock-1.11.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.76.0-py3-none-any.whl (414 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.149 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.149-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3202652 sha256=5d8295a4cdc03aaa4ae9f22a436b7504f45fe952b7a7032b1d32f858eeb5f69d
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.149 botocore-1.29.149 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.76.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
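
(For context: the pre-building workflow mentioned above is enabled through pipeline options. Per the linked Dataflow guide they look roughly like the following; the registry URL is a placeholder, not taken from this log:)

    --prebuild_sdk_container_engine=cloud_build
    --docker_registry_push_url=gcr.io/<project>/<repo>
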
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0608131119.1686236896.884726/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0608131119.1686236896.884726/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0608131119.1686236896.884726/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0608131119.1686236896.884726/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230608150816885778-5417'
 createTime: '2023-06-08T15:08:17.975826Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-08_08_08_17-12562312136469905881'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0608131119'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-08T15:08:17.975826Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-08_08_08_17-12562312136469905881]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-08_08_08_17-12562312136469905881
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-08_08_08_17-12562312136469905881?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-08_08_08_17-12562312136469905881 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:22.938Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.739Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.771Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.834Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.893Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.940Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.987Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.044Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.084Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.111Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.146Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.178Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.201Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.234Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.265Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.300Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.333Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.354Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.385Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.409Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.431Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
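
(The fused steps above imply a pipeline of roughly this shape. This is a hedged sketch only: pipeline_options, synthetic_source, MeasureTime, consume, and the top-count of 20 are assumed stand-ins for the load-test helpers, not the real combine_test.py code:)

    import apache_beam as beam
    from apache_beam.transforms.combiners import TopCombineFn

    with beam.Pipeline(options=pipeline_options) as p:
        _ = (
            p
            | 'Read synthetic' >> beam.io.Read(synthetic_source)
            | 'Measure time: Start' >> beam.ParDo(MeasureTime())
            # CombineGlobally expands into the KeyWithVoid/CombinePerKey/UnKey
            # steps listed in the fusion log above.
            | 'Combine with Top 0' >> beam.CombineGlobally(TopCombineFn(20))
            | 'Consume 0' >> beam.Map(consume)
            | 'Measure time: End 0' >> beam.ParDo(MeasureTime()))
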
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.509Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.537Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.561Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.596Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.620Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:29.100Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:29.128Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:29.158Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-08_08_08_17-12562312136469905881 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:54.924Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
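
(As a hedged illustration of the cleanup that message suggests, using the google-cloud-monitoring client library; this is not part of the build, and in practice you would filter far more narrowly before deleting anything:)

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"  # project from the log above
    for descriptor in client.list_metric_descriptors(name=project_name):
        # Only custom metric descriptors count toward the 100-descriptor limit.
        if descriptor.type.startswith("custom.googleapis.com/"):
            client.delete_metric_descriptor(name=descriptor.name)
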
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:09:23.886Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:09:23.908Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:09:33.725Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:09:57.719Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:10:08.421Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:20:06.823Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:50:37.860Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T16:33:43.389Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T16:56:40.586Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T17:21:41.975Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T17:47:42.691Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T18:12:43.693Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T18:39:44.993Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T19:04:46.347Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T19:29:57.566Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T19:54:49.238Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-08_08_08_17-12562312136469905881 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.084Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-08_08_08_17-12562312136469905881.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.105Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.155Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.181Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.205Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.227Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-08_08_08_17-12562312136469905881?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 8s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/fqx32dbltplh4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1018

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1018/display/redirect>

Changes:


------------------------------------------
[...truncated 32.33 KB...]
  Using cached requests_mock-1.10.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.76.0-py3-none-any.whl (414 kB)
Collecting boto3<2,>=1.9 (from apache-beam==2.49.0.dev0)
  Using cached boto3-1.26.148-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.148 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.148-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3199731 sha256=39032d395e20967c971d1d21f9d3237770036bf94d8351711e598b5e88f51b75
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.148 botocore-1.29.148 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.76.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0607125432.1686150469.748015/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0607125432.1686150469.748015/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0607125432.1686150469.748015/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0607125432.1686150469.748015/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230607150749749050-6760'
 createTime: '2023-06-07T15:07:50.816354Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-07_08_07_50-4886773571697503226'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0607125432'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-07T15:07:50.816354Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-07_08_07_50-4886773571697503226]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-07_08_07_50-4886773571697503226
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-07_08_07_50-4886773571697503226?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-07_08_07_50-4886773571697503226 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:02.805Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:08.968Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:08.995Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.055Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.125Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.165Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.229Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.285Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.313Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.341Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.364Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.386Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.411Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.447Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.472Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.496Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.523Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.550Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.584Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.608Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.641Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.757Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.788Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.810Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.838Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.863Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:10.019Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:10.056Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:10.090Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-07_08_07_50-4886773571697503226 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:32.615Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:09:02.663Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:09:35.103Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:09:45.387Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:10:00.629Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:41:20.262Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T16:06:17.427Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T16:31:19.078Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T16:56:21.067Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T17:21:22.125Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T17:47:23.279Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T18:13:44.883Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T18:38:26.072Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T19:03:26.671Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T19:29:27.874Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T19:54:29.367Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-07_08_07_50-4886773571697503226 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.741Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-07_08_07_50-4886773571697503226.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.766Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.827Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.852Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.874Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.896Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-07_08_07_50-4886773571697503226?project=<ProjectId>
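For context on the failure above: load_test.py bounds wait_until_finish() with a duration, and the Dataflow runner raises once the (by then cancelled) job still has not reached a terminal state. A minimal sketch of that wait-then-cancel pattern, assuming the apache_beam Python SDK; the pipeline body and timeout value are placeholders, not the load test's actual configuration:

# Minimal sketch of the wait/cancel pattern behind the traceback above.
# Assumes apache_beam; the trivial Create/CombineGlobally body and the
# timeout are illustrative stand-ins for the Combine load test.
import apache_beam as beam
from apache_beam.runners.runner import PipelineState

p = beam.Pipeline()  # DirectRunner by default; the load test runs on Dataflow
_ = p | beam.Create(range(1000)) | beam.CombineGlobally(sum)
result = p.run()

timeout_ms = 4 * 60 * 60 * 1000  # illustrative budget, not the test's value
state = result.wait_until_finish(duration=timeout_ms)
if not PipelineState.is_terminal(state):
    # Cancel the stuck job; on Dataflow the runner raises the
    # AssertionError seen above when the job never becomes terminal.
    result.cancel()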

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 20s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/2pkbvtv67jvhm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1017

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1017/display/redirect>

Changes:


------------------------------------------
[...truncated 32.53 KB...]
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.76.0-py3-none-any.whl (414 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.49.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_core-1.27.0-py3-none-any.whl (174 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.147 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.147-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3199731 sha256=8ea00cca39ed6fb00a6c771d17068be7a6edcbcc5b6771776b7ddad4699c2bbc
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.147 botocore-1.29.147 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.76.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
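The pre-building workflow the runner suggests here is enabled through pipeline options. A minimal sketch, assuming the Beam Python SDK's --prebuild_sdk_container_engine and --docker_registry_push_url setup options described at the linked page; the registry URL and requirements file are placeholders:

# Minimal sketch, assuming the SDK container pre-building options from
# the page linked above; all values below are hypothetical.
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    "--runner=DataflowRunner",
    "--requirements_file=requirements.txt",  # the extra dependencies
    # Build the SDK worker container once up front instead of installing
    # the dependencies on every worker at startup:
    "--prebuild_sdk_container_engine=cloud_build",
    "--docker_registry_push_url=gcr.io/my-project/prebuilt-sdk",
])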
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
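The two auth log lines above amount to the following; a minimal sketch of what apache_beam.internal.gcp.auth is reporting:

import socket

socket.setdefaulttimeout(60)       # "Setting socket default timeout to 60 seconds."
print(socket.getdefaulttimeout())  # 60.0 -> "socket default timeout is 60.0 seconds."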
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0606131707.1686064903.630362/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0606131707.1686064903.630362/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0606131707.1686064903.630362/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0606131707.1686064903.630362/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230606152143632368-9794'
 createTime: '2023-06-06T15:21:46.791860Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-06_08_21_46-16943315462094997206'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0606131707'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-06T15:21:46.791860Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-06_08_21_46-16943315462094997206]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-06_08_21_46-16943315462094997206
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-06_08_21_46-16943315462094997206?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-06_08_21_46-16943315462094997206 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:49.561Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:50.865Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:50.883Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:50.954Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.013Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.043Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.111Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.180Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.234Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.264Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.291Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.319Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.342Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.380Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.411Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.443Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.477Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.510Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.537Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.566Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.601Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
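The fusion messages above trace the optimizer over a pipeline shaped roughly as follows. A minimal sketch, assuming apache_beam, with the synthetic source stubbed by Create and illustrative step labels mirroring the log; CombineGlobally expands into the KeyWithVoid / CombinePerKey / UnKey steps named in the messages:

# Rough sketch of the pipeline being fused above. The synthetic source,
# timing stubs, and labels are illustrative, not the load test's code.
import apache_beam as beam

with beam.Pipeline() as p:
    _ = (
        p
        | "Read synthetic" >> beam.Create([(i % 7, i) for i in range(1000)])
        | "Measure time: Start" >> beam.Map(lambda kv: kv)  # timing stub
        # CombineGlobally expands into KeyWithVoid -> CombinePerKey
        # (ConvertToAccumulators / Combine / Extract) -> UnKey, matching
        # the fused step names in the log.
        | "Combine with Top 0" >> beam.CombineGlobally(
            beam.combiners.TopCombineFn(10))
        | "Consume 0" >> beam.Map(lambda top: len(top))
        | "Measure time: End 0" >> beam.Map(lambda x: x)  # timing stub
    )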
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.698Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.721Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.750Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.787Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.819Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-06_08_21_46-16943315462094997206 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.993Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:52.016Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:52.065Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:22:13.674Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:22:35.687Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:23:10.828Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:23:21.329Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:45:01.346Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T16:15:14.980Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T16:40:21.434Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T17:05:19.293Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T17:24:20.206Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T17:43:31.897Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:02:23.470Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:22:24.670Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:42:15.829Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:01:27.233Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:35:29.995Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:54:30.692Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-06_08_21_46-16943315462094997206 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.083Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-06_08_21_46-16943315462094997206.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.111Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.154Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.179Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.208Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.224Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-06_08_21_46-16943315462094997206?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 43m 18s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/kiemsgqsjo2l4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1016

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1016/display/redirect?page=changes>

Changes:

[noreply] Removing an unnecessary dependency (#27001)


------------------------------------------
[...truncated 32.75 KB...]
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.10.0-py2.py3-none-any.whl (101 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.2-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.2-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.49.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_core-1.27.0-py3-none-any.whl (174 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.146 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.146-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3088274 sha256=a2f6f355b217e9c5b044b2315991a05d77f09a08f1c4163e5d7c35d1332582e1
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.146 botocore-1.29.146 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.76.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0605125355.1685977697.088578/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0605125355.1685977697.088578/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0605125355.1685977697.088578/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0605125355.1685977697.088578/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230605150817089670-9495'
 createTime: '2023-06-05T15:08:18.313892Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-05_08_08_17-8750074020061385336'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0605125355'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-05T15:08:18.313892Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-05_08_08_17-8750074020061385336]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-05_08_08_17-8750074020061385336
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-05_08_08_17-8750074020061385336?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-05_08_08_17-8750074020061385336 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:30.897Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:33.716Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:33.751Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:33.824Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:33.922Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:33.961Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.031Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.104Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.167Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.222Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.258Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.282Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.315Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.459Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.541Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.642Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.706Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.783Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.857Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.888Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.930Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.134Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.205Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.249Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.296Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.353Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.588Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.661Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.772Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-05_08_08_17-8750074020061385336 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:39.377Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:09:19.928Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:10:10.205Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:10:20.262Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:11:29.842Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:16:12.901Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:16:41.095Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:40:35.712Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T16:05:37.212Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T16:29:43.667Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T16:42:17.326Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T16:42:28.827Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T16:49:39.580Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T17:08:41.423Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T17:27:43.168Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T17:46:43.817Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T18:06:55.312Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T18:26:47.045Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T18:46:47.939Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T19:05:48.770Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T19:25:50.362Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T19:45:51.293Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:10:52.960Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.522Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-05_08_08_17-8750074020061385336.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.570Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.652Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.678Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.712Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.736Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-05_08_08_17-8750074020061385336 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-05_08_08_17-8750074020061385336?project=<ProjectId>
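The AssertionError above arises because wait_until_finish(duration=...) returns the current, possibly non-terminal, state once the duration elapses; the cancel request earlier in the log is the test tearing the job down afterwards. A simplified sketch of that pattern (names are illustrative, not the exact load_test.py code):

    from apache_beam.runners.runner import PipelineState

    def run_and_assert_terminal(pipeline, timeout_ms):
        # wait_until_finish returns after timeout_ms even if the
        # streaming job is still running.
        result = pipeline.run()
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            result.cancel()  # matches the JOB_STATE_CANCELLING seen above
            raise AssertionError(
                'Job did not reach a terminal state: %s' % state)
        return state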

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5h 16m 14s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/sbqrcbnfp545g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1015

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1015/display/redirect>

Changes:


------------------------------------------
[...truncated 32.31 KB...]
Collecting requests_mock<2.0,>=1.7 (from apache-beam==2.49.0.dev0)
  Using cached requests_mock-1.10.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.75.9-py3-none-any.whl (414 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.146 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.146-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3088274 sha256=ba997a4e1ff71ae6cbb6a1e639f5013e0fa5d921874292fae1cb1ce61ae65b21
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.146 botocore-1.29.146 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.9 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
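The pre-building hint above refers to Dataflow's SDK container pre-building workflow. A hedged sketch of the relevant pipeline options (the flag names follow the documented workflow, the registry URL is a placeholder, and this build does not actually set any of them):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        # Build the SDK container once up front instead of installing
        # dependencies on every worker start.
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/<your-project>/prebuilt',  # placeholder
        '--sdk_container_image='
        'gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422',
    ])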
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0604125354.1685891271.678458/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0604125354.1685891271.678458/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0604125354.1685891271.678458/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0604125354.1685891271.678458/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230604150751680141-6082'
 createTime: '2023-06-04T15:07:53.062987Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-04_08_07_52-17522196373556568990'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0604125354'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-04T15:07:53.062987Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-04_08_07_52-17522196373556568990]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-04_08_07_52-17522196373556568990
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-04_08_07_52-17522196373556568990?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-04_08_07_52-17522196373556568990 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:57.610Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.640Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.669Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.718Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.783Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.798Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.852Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.894Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.939Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.972Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.994Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.025Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.057Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.091Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.125Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.147Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.179Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.256Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.280Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.311Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
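The fusion messages above trace the shape of the Combine load test. A minimal sketch of that shape (beam.Create stands in for the configurable SyntheticSource the real combine_test.py reads from, and its 'Measure time' steps record latency metrics rather than passing elements through):

    import apache_beam as beam
    from apache_beam.transforms.combiners import TopCombineFn

    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read synthetic' >> beam.Create([(b'key', b'value')])
            | 'Measure time: Start' >> beam.Map(lambda kv: kv)
            | 'Combine with Top 0' >> beam.CombineGlobally(
                TopCombineFn(10)).without_defaults()
            | 'Consume 0' >> beam.FlatMap(lambda tops: tops)
            | 'Measure time: End 0' >> beam.Map(lambda kv: kv)
        )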
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.397Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.444Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.476Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.500Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.521Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.713Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.752Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.780Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-04_08_07_52-17522196373556568990 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:08:31.588Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:08:48.571Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:09:24.885Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:09:25.782Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:09:35.002Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:41:52.599Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T16:03:54.426Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T16:28:55.672Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T16:53:56.795Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T17:20:57.843Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T17:45:59.136Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T18:11:00.834Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T18:36:03.034Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T18:39:25.785Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T19:08:05.676Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T19:33:07.184Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T19:57:08.535Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-04_08_07_52-17522196373556568990 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.781Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-04_08_07_52-17522196373556568990.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.815Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.880Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.904Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.929Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.950Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-04_08_07_52-17522196373556568990?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/s62cdvvlbcsje

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1014

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1014/display/redirect>

Changes:


------------------------------------------
[...truncated 32.23 KB...]
  Using cached google_cloud_pubsub-2.17.1-py2.py3-none-any.whl (265 kB)
Collecting google-cloud-pubsublite<2,>=1.2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_pubsublite-1.7.0-py2.py3-none-any.whl (273 kB)
Collecting google-cloud-bigquery<4,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery-3.11.0-py2.py3-none-any.whl (219 kB)
Collecting google-cloud-bigquery-storage<3,>=2.6.3 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery_storage-2.20.0-py2.py3-none-any.whl (190 kB)
Collecting google-cloud-core<3,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<2.18.0,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_spanner-3.35.1-py2.py3-none-any.whl (332 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.10.0-py2.py3-none-any.whl (101 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.2-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.2-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.146 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.146-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3088274 sha256=c0508a2041b0add787568225bafa42e0853e624c9105cb222cb8972ed1974861
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.146 botocore-1.29.146 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.9 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0603125346.1685804872.311847/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0603125346.1685804872.311847/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0603125346.1685804872.311847/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0603125346.1685804872.311847/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230603150752312864-9714'
 createTime: '2023-06-03T15:07:53.534900Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-03_08_07_52-12972041187723029300'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0603125346'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-03T15:07:53.534900Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-03_08_07_52-12972041187723029300]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-03_08_07_52-12972041187723029300
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-03_08_07_52-12972041187723029300?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-03_08_07_52-12972041187723029300 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:07:58.769Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.285Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.312Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.360Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.417Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.441Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.504Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.553Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.587Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.613Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.641Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.710Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.730Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.763Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.799Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.823Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.906Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.940Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.972Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.997Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
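
(The fused step names above correspond to a pipeline shaped roughly like the sketch below; beam.Create and the identity Maps are stand-ins for Beam's synthetic source and time-measurement load-test utilities, not their actual implementations. Note how CombineGlobally expands into exactly the KeyWithVoid/CombinePerKey/UnKey steps seen in the log.)

    import apache_beam as beam

    # Hedged sketch of the pipeline shape implied by the fusing log above.
    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read synthetic' >> beam.Create(range(1000))       # stand-in source
            | 'Measure time: Start' >> beam.Map(lambda x: x)     # stand-in no-op
            | 'Combine with Top 0' >> beam.CombineGlobally(
                beam.combiners.TopCombineFn(10)).without_defaults()
            | 'Consume 0' >> beam.FlatMap(lambda tops: tops)
            | 'Measure time: End 0' >> beam.Map(lambda x: x))    # stand-in no-op
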
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.095Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.137Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.158Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.179Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.206Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.384Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.413Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.441Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-03_08_07_52-12972041187723029300 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:25.684Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
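
(On the metric-descriptor cap flagged above: a rough sketch of pruning stale custom descriptors with the google-cloud-monitoring client, assuming that library is installed and using an example project id. Deleting a descriptor permanently discards its time series, so verify what is unused before running anything like this.)

    from google.cloud import monitoring_v3

    # Hedged sketch: enumerate custom metric descriptors and delete stale ones.
    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"  # example project id
    request = {
        "name": project_name,
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Each delete permanently removes that descriptor and its history.
        client.delete_metric_descriptor(request={"name": descriptor.name})
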
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:41.767Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:09:16.792Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:09:27.440Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:15:36.081Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:44:55.552Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T16:11:07.115Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T16:36:58.406Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T17:02:01.058Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T17:24:02.951Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T17:49:04.149Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T18:16:15.641Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T18:41:07.200Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T19:06:07.966Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T19:32:09.562Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T19:57:10.629Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.058Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-03_08_07_52-12972041187723029300.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.086Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.148Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.172Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.188Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.207Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-03_08_07_52-12972041187723029300 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-03_08_07_52-12972041187723029300?project=<ProjectId>
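
(The assertion above is raised out of wait_until_finish once the bounded wait expires without the job reaching a terminal state. As a rough illustration, not the load test's actual code, a guard of the following shape, built on the standard PipelineResult and PipelineState APIs, waits and then cancels a job that never settles:)

    from apache_beam.runners.runner import PipelineState

    # Hedged sketch: `result` is assumed to be a PipelineResult from run().
    def finish_or_cancel(result, timeout_ms):
        """Wait up to timeout_ms; cancel the job if it never goes terminal."""
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            result.cancel()                     # ask the service to stop the job
            state = result.wait_until_finish()  # block until cancellation lands
        return state
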

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
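
(When reproducing this locally, those individual deprecation warnings can be surfaced by re-running the failing task with the suggested flag, e.g. ./gradlew :sdks:python:apache_beam:testing:load_tests:run --warning-mode all; invoking the gradlew wrapper from the Beam repository root is an assumption about the checkout layout.)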

BUILD FAILED in 4h 55m 30s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/vzzzggkvismtw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1013

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1013/display/redirect>

Changes:


------------------------------------------
[...truncated 32.20 KB...]
Collecting pyyaml<7.0.0,>=3.12 (from apache-beam==2.49.0.dev0)
  Using cached PyYAML-6.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (596 kB)
Collecting requests_mock<2.0,>=1.7 (from apache-beam==2.49.0.dev0)
  Using cached requests_mock-1.10.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.75.9-py3-none-any.whl (414 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.145 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.145-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3080678 sha256=1f82aadbd8d9986f596c792cd10232e0b6595f7a068c4443f703d111fb1028d6
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.145 botocore-1.29.145 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.9 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
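
(A minimal sketch of opting in to that pre-building workflow via pipeline options: the two flags below are the prebuild options from the linked guide, while the registry URL is a placeholder, not a real repository.)

    from apache_beam.options.pipeline_options import PipelineOptions

    # Hedged sketch: pre-build the SDK worker container once on Cloud Build
    # instead of reinstalling dependencies on every worker at startup.
    options = PipelineOptions([
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/my-project/prebuilt-beam',  # placeholder
    ])
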
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0602125348.1685718470.573914/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0602125348.1685718470.573914/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0602125348.1685718470.573914/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0602125348.1685718470.573914/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230602150750575017-4669'
 createTime: '2023-06-02T15:07:51.794282Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-02_08_07_51-2028071261042916753'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0602125348'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-02T15:07:51.794282Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-02_08_07_51-2028071261042916753]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-02_08_07_51-2028071261042916753
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-02_08_07_51-2028071261042916753?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-02_08_07_51-2028071261042916753 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:56.751Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.064Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.099Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.179Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.246Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.269Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.332Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.402Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.435Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.472Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.504Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.533Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.568Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.598Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.623Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.648Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.679Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.701Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.724Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.758Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.790Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.899Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.935Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.965Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.997Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:59.031Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:59.209Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:59.245Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:59.287Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-02_08_07_51-2028071261042916753 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:08:16.420Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:08:29.081Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:08:39.962Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:09:29.981Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:09:40.211Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:38:29.081Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T16:49:56.509Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T17:15:01.340Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T17:39:58.517Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T18:04:59.796Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T18:30:14.606Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T18:55:13.512Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T19:20:05.868Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T19:45:07.790Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-02_08_07_51-2028071261042916753 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:08.789Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-02_08_07_51-2028071261042916753.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:08.831Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:08.948Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:08.967Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:08.990Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:09.012Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-02_08_07_51-2028071261042916753?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 18s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/wtv7wtfupepxa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1012

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1012/display/redirect>

Changes:


------------------------------------------
[...truncated 35.14 KB...]
  Using cached botocore-1.29.144-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Downloading docker-6.1.3-py3-none-any.whl (148 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 148.1/148.1 kB 4.0 MB/s eta 0:00:00
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3080577 sha256=65d8042fef003b9cc6c478d443222ec20285be7fee834277a52fab1bdff1b580
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.144 botocore-1.29.144 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.9 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.14 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0601132530.1685632462.782840/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0601132530.1685632462.782840/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0601132530.1685632462.782840/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0601132530.1685632462.782840/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230601151422784358-5369'
 createTime: '2023-06-01T15:14:25.418356Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-01_08_14_24-4537209592619833224'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0601132530'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-01T15:14:25.418356Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-01_08_14_24-4537209592619833224]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-01_08_14_24-4537209592619833224
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-01_08_14_24-4537209592619833224?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-01_08_14_24-4537209592619833224 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:29.890Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.137Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.166Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.282Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.383Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.423Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.509Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.579Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.752Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.808Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.849Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.883Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.907Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.939Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.964Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.037Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.066Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.104Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.150Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.197Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.312Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.397Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.437Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.470Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.531Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.789Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.831Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.909Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-01_08_14_24-4537209592619833224 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:53.266Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
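
The descriptor-quota warning above links to the Monitoring API's list/delete methods; the following is a minimal cleanup sketch, assuming the google-cloud-monitoring Python client (the function name and dry_run flag are illustrative, not part of the load test):

    from google.cloud import monitoring_v3

    def delete_stale_custom_descriptors(project_id, dry_run=True):
        # List descriptors created under custom.googleapis.com/ and, when
        # dry_run is off, delete them to free up descriptor quota.
        # Deleting a descriptor also deletes its time series, hence the dry run.
        client = monitoring_v3.MetricServiceClient()
        request = {
            "name": "projects/%s" % project_id,
            "filter": 'metric.type = starts_with("custom.googleapis.com/")',
        }
        for descriptor in client.list_metric_descriptors(request=request):
            print(("DRY RUN: " if dry_run else "deleting: ") + descriptor.type)
            if not dry_run:
                client.delete_metric_descriptor(name=descriptor.name)
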
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:24.277Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:24.298Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:34.181Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:34.220Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:44.220Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:54.271Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:16:01.511Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:39:29.799Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T16:09:57.704Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T16:11:01.240Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T16:35:59.250Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T16:37:04.237Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T16:38:05.253Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:03:03.270Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:04:04.158Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:06:05.894Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:29:08.143Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:43:09.646Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:55:10.161Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T18:17:11.773Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T18:21:16.524Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T18:42:13.805Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T18:47:18.384Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T19:07:20.098Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T19:12:17.430Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T19:32:22.045Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T19:40:23.235Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T19:58:22.064Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T20:07:23.854Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T20:33:25.980Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T20:34:26.756Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T20:58:29.171Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T21:00:54.947Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T21:22:42.492Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T21:46:44.540Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T22:13:35.918Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T22:30:40.879Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T22:39:38.220Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T22:56:39.392Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T23:05:41.339Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T23:22:42.225Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T23:31:44.192Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T23:48:48.968Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T23:57:49.779Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-01_08_14_24-4537209592619833224 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.459Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-01_08_14_24-4537209592619833224.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.508Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.602Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.628Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.652Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.683Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-01_08_14_24-4537209592619833224?project=<ProjectId>
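
The assertion above comes out of the standard Beam result-waiting pattern; a condensed sketch of that pattern follows (the function name and the explicit cancel are illustrative simplifications, not the exact load_test.py source):

    from apache_beam.runners.runner import PipelineState

    def run_and_require_terminal(pipeline, timeout_ms):
        # pipeline is a constructed apache_beam.Pipeline; timeout_ms is the
        # wait budget passed to wait_until_finish, in milliseconds.
        result = pipeline.run()
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            result.cancel()  # best effort: stop the stuck streaming job
            raise AssertionError('Job did not reach a terminal state: %s' % state)
        return state
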

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8h 54m 43s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/y64bqcn4axjn4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1011

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1011/display/redirect>

Changes:


------------------------------------------
[...truncated 32.48 KB...]
  Using cached requests_mock-1.10.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.0-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.75.7-py3-none-any.whl (414 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.143 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.143-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3080480 sha256=16f31d59580e0e6adee3f9e5be3828c3111663af7d1a52dc0d90cadfd5a3faad
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.143 botocore-1.29.143 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.0 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.7 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.14 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
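
The hint above refers to Dataflow's SDK container pre-building workflow; a minimal sketch of opting into it via pipeline options (the registry URL is a placeholder, and the flags are assumed from Beam's container pre-building options rather than taken from this job's configuration):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Sketch: build the SDK worker container once up front (here via Cloud
    # Build) so worker VMs skip re-installing extra dependencies at startup.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/<project>/prebuilt-sdk',  # placeholder
    ])
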
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0531125357.1685545672.792908/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0531125357.1685545672.792908/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0531125357.1685545672.792908/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0531125357.1685545672.792908/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230531150752793909-6553'
 createTime: '2023-05-31T15:07:53.797366Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-31_08_07_53-2063206424136828092'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0531125357'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-31T15:07:53.797366Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-31_08_07_53-2063206424136828092]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-31_08_07_53-2063206424136828092
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-31_08_07_53-2063206424136828092?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-31_08_07_53-2063206424136828092 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:01.027Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.341Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.419Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.492Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.564Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.602Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.658Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.733Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.780Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.812Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.842Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.864Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.896Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.918Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.953Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:02.987Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.024Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.055Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.087Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.120Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.156Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.277Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.329Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.363Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.399Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.433Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.629Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.655Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:03.721Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-31_08_07_53-2063206424136828092 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:23.381Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:53.641Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:08:56.771Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:11:47.187Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:11:54.133Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T15:39:27.761Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T16:06:29.065Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T16:30:34.340Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T16:54:32.018Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T17:13:38.043Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T17:32:45.896Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T17:57:37.894Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T18:17:42.782Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T18:49:50.148Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T19:14:41.829Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T19:33:42.857Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T19:58:47.847Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T20:00:51.629Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-31_08_07_53-2063206424136828092.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T20:00:51.656Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T20:00:51.704Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T20:00:51.729Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T20:00:51.780Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-31T20:00:51.798Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-31_08_07_53-2063206424136828092 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-31_08_07_53-2063206424136828092?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 23s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/6vcvuwep5xl6y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1010

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1010/display/redirect?page=changes>

Changes:

[noreply] Bump cloud.google.com/go/pubsub from 1.30.1 to 1.31.0 in /sdks (#26888)


------------------------------------------
[...truncated 32.35 KB...]
  Using cached google_cloud_pubsublite-1.7.0-py2.py3-none-any.whl (273 kB)
Collecting google-cloud-bigquery<4,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery-3.10.0-py2.py3-none-any.whl (218 kB)
Collecting google-cloud-bigquery-storage<3,>=2.6.3 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery_storage-2.19.1-py2.py3-none-any.whl (190 kB)
Collecting google-cloud-core<3,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<2.18.0,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_spanner-3.35.0-py2.py3-none-any.whl (331 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.9.1-py2.py3-none-any.whl (99 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.1-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.142 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.142-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3080441 sha256=e30829687d05568a5f7be37ebc3c12d919e4c185f0dcd55d3632cc230e1521a8
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.142 botocore-1.29.142 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.6 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.14 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
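
A note on the pre-building hint above: the workflow it links to can be opted into through pipeline options. The sketch below is a minimal illustration, assuming the Cloud Build pre-building engine is available for this SDK version; the project, bucket, and registry URL are placeholders, not values taken from this job.

    from apache_beam.options.pipeline_options import PipelineOptions

    # Sketch only: bake the pipeline's extra dependencies into the SDK
    # container once, instead of pip-installing them on every worker start.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                # placeholder
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',  # placeholder
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=us-central1-docker.pkg.dev/my-project/my-repo',  # placeholder
    ])
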
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0530125354.1685459268.396914/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0530125354.1685459268.396914/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0530125354.1685459268.396914/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0530125354.1685459268.396914/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230530150748399147-1640'
 createTime: '2023-05-30T15:07:52.205283Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-30_08_07_48-12774470686206562279'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0530125354'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-30T15:07:52.205283Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-30_08_07_48-12774470686206562279]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-30_08_07_48-12774470686206562279
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-30_08_07_48-12774470686206562279?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-30_08_07_48-12774470686206562279 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:56.688Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.244Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.284Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.344Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.415Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.444Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.488Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.552Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.611Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.639Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.670Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.733Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.762Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.791Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.821Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.854Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.882Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.922Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.944Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.966Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:07:59.994Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
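
The fusion messages above follow the standard expansion of a global Top combine over a synthetic source: KeyWithVoid -> CombinePerKey -> UnKey is how CombineGlobally is expanded. A rough sketch of a pipeline with this shape (illustrative record sizes and Top width, not the exact load-test code):

    import apache_beam as beam
    from apache_beam.testing.synthetic_pipeline import SyntheticSource

    def build(p):
        return (
            p
            | 'Read synthetic' >> beam.io.Read(SyntheticSource({
                'numRecords': 1000,      # illustrative sizes only
                'keySizeBytes': 10,
                'valueSizeBytes': 90,
            }))
            # CombineGlobally expands into the KeyWithVoid / CombinePerKey /
            # UnKey stages named in the fusion messages above.
            | 'Combine with Top 0' >> beam.CombineGlobally(
                beam.combiners.TopCombineFn(16)).without_defaults()
            | 'Consume 0' >> beam.Map(len))
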
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:08:00.088Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:08:00.136Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:08:00.163Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:08:00.200Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:08:00.231Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:08:00.417Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:08:00.490Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:08:00.520Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-30_08_07_48-12774470686206562279 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:08:09.874Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
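
Once the 100-descriptor cap above is hit, only the custom.googleapis.com/* copies of user counters stop being created; the dataflow.googleapis.com/job/user_counter metric keeps working. A hedged sketch of listing (and optionally pruning) old custom descriptors with the google-cloud-monitoring client; that package is not part of this job's environment, and the filter shown is an assumption:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    # List only the Dataflow-created custom metric descriptors.
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)  # uncomment to prune
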
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:08:51.095Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:09:22.494Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:09:32.706Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:29:49.791Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T15:58:24.343Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T16:24:29.805Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T16:46:27.859Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T17:05:29.008Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T17:25:30.272Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T17:58:35.489Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T18:17:36.798Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T18:42:37.545Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T19:08:38.642Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T19:28:49.551Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T19:53:36.597Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T19:54:48.021Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-30_08_07_48-12774470686206562279 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T20:01:02.514Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-30_08_07_48-12774470686206562279.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T20:01:02.551Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T20:01:02.628Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T20:01:02.653Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T20:01:02.690Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-30T20:01:02.716Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-30_08_07_48-12774470686206562279?project=<ProjectId>
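
For context, this assertion fires after the harness's bounded wait elapses and the job is cancelled (the JOB_STATE_CANCELLING transition above). A minimal sketch of that wait-then-cancel pattern, assuming a PipelineResult named result and a millisecond timeout named timeout_ms (both names illustrative):

    from apache_beam.runners.runner import PipelineState

    # wait_until_finish() returns the job's current state once the timeout
    # elapses, even if the job is still running.
    state = result.wait_until_finish(duration=timeout_ms)
    if not PipelineState.is_terminal(state):
        result.cancel()  # shows up as JOB_STATE_CANCELLING in the job log
        raise AssertionError('Job did not reach a terminal state: %s' % state)
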

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 6s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/wysgv5yemntd2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1009

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1009/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #26919: warn if BigQuery failed rows collection is


------------------------------------------
[...truncated 32.22 KB...]
  Using cached google_cloud_pubsub-2.17.1-py2.py3-none-any.whl (265 kB)
Collecting google-cloud-pubsublite<2,>=1.2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_pubsublite-1.7.0-py2.py3-none-any.whl (273 kB)
Collecting google-cloud-bigquery<4,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery-3.10.0-py2.py3-none-any.whl (218 kB)
Collecting google-cloud-bigquery-storage<3,>=2.6.3 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery_storage-2.19.1-py2.py3-none-any.whl (190 kB)
Collecting google-cloud-core<3,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<2.18.0,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_spanner-3.35.0-py2.py3-none-any.whl (331 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.9.1-py2.py3-none-any.whl (99 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.1-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.142 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.142-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3080441 sha256=443efe33492cb70e405f371a8c811cea4793f33cb5e2d976528b10eef779c71c
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.142 botocore-1.29.142 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.6 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.14 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0529125347.1685372875.095906/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0529125347.1685372875.095906/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0529125347.1685372875.095906/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0529125347.1685372875.095906/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230529150755098279-3881'
 createTime: '2023-05-29T15:07:56.637137Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-29_08_07_55-6303628460666151418'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0529125347'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-29T15:07:56.637137Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-29_08_07_55-6303628460666151418]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-29_08_07_55-6303628460666151418
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-29_08_07_55-6303628460666151418?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-29_08_07_55-6303628460666151418 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:00.821Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.109Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.142Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.205Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.268Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.294Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.368Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.436Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.476Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.515Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.555Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.580Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.601Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.643Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.670Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.699Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.739Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.773Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.806Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.858Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:02.900Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:03.003Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:03.061Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:03.099Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:03.132Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:03.169Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:03.358Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:03.394Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:03.448Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-29_08_07_55-6303628460666151418 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:31.421Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:08:43.402Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:09:15.078Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:09:25.340Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:30:15.940Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T15:59:06.401Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T16:24:02.190Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T16:49:05.513Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T17:14:06.783Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T17:39:04.076Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T18:04:15.365Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T18:31:17.044Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T18:50:08.162Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T19:11:12.691Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T19:55:20.769Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-29_08_07_55-6303628460666151418 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T20:01:19.405Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-29_08_07_55-6303628460666151418.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T20:01:19.439Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T20:01:19.505Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T20:01:19.534Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T20:01:19.557Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-29T20:01:19.578Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-29_08_07_55-6303628460666151418?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 23s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/ezydaz24huspi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1008

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1008/display/redirect>

Changes:


------------------------------------------
[...truncated 32.54 KB...]
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-40.0.2-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.75.6-py3-none-any.whl (413 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.142 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.142-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3080441 sha256=4552d4cce40b77a543a7e3c9c9b7144b6515214cf21152521c15ddfa85ea2212
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.142 botocore-1.29.142 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.6 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.14 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
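
The pre-building workflow referenced in that message is normally enabled through pipeline options. A minimal sketch, assuming a placeholder registry path (the two prebuild flags are Beam's SDK container pre-building options; everything else here is illustrative):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Sketch: bake the pipeline's extra dependencies into a custom SDK image
    # once at submission time, instead of installing them on every worker.
    # The registry URL below is a placeholder, not taken from this log.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--prebuild_sdk_container_engine=cloud_build',  # or 'local_docker'
        '--docker_registry_push_url=gcr.io/<ProjectId>/prebuilt-beam-sdk',
    ])
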
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0528125345.1685286470.303034/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0528125345.1685286470.303034/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0528125345.1685286470.303034/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0528125345.1685286470.303034/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230528150750305467-2325'
 createTime: '2023-05-28T15:07:51.564262Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-28_08_07_50-13068319796460049365'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0528125345'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-28T15:07:51.564262Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-28_08_07_50-13068319796460049365]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-28_08_07_50-13068319796460049365
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-28_08_07_50-13068319796460049365?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-28_08_07_50-13068319796460049365 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:56.085Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.449Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.481Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.551Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.609Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.633Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.690Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.742Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.830Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.883Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.905Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.924Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.957Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:57.989Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.023Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.044Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.066Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.090Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.114Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.136Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
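
The step names in the fusion messages above follow CombineGlobally's expansion (KeyWithVoid -> CombinePerKey -> UnKey). A rough, self-contained sketch of the pipeline shape they imply, with a beam.Create stand-in for the synthetic source and a pass-through DoFn standing in for the timing steps:

    import apache_beam as beam
    from apache_beam.transforms.combiners import TopCombineFn

    class MeasureTime(beam.DoFn):
        # Stand-in for the load test's timing DoFn; it just forwards elements.
        def process(self, element):
            yield element

    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read synthetic' >> beam.Create(range(1000))  # stand-in source
            | 'Measure time: Start' >> beam.ParDo(MeasureTime())
            | 'Combine with Top 0' >> beam.CombineGlobally(TopCombineFn(n=20))
            | 'Consume 0' >> beam.Map(lambda top: top)
            | 'Measure time: End 0' >> beam.ParDo(MeasureTime())
        )
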
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.212Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.260Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.283Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.307Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.332Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.542Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.562Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:07:58.611Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-28_08_07_50-13068319796460049365 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:08:06.076Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
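
As that message suggests, unused custom metric descriptors can be removed through the Monitoring API. A hedged sketch with the google-cloud-monitoring client (the type-prefix filter is an illustrative assumption; review the listing before deleting anything):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    for descriptor in client.list_metric_descriptors(name=project_name):
        # Deleting a descriptor is permanent; inspect the list first.
        if descriptor.type.startswith('custom.googleapis.com/'):
            client.delete_metric_descriptor(name=descriptor.name)
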
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:08:37.246Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:08:37.263Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:08:47.046Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:09:08.470Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:09:20.721Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:26:02.572Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T15:54:23.231Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T16:38:26.372Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T17:03:25.409Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T17:28:29.999Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T17:52:31.001Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T17:56:02.573Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T18:21:29.535Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T18:46:30.966Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T19:05:32.312Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T19:30:33.774Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T19:55:34.966Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-28_08_07_50-13068319796460049365 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T20:00:58.215Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-28_08_07_50-13068319796460049365.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T20:00:58.252Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T20:00:58.329Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T20:00:58.353Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T20:00:58.435Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-28T20:00:58.459Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-28_08_07_50-13068319796460049365?project=<ProjectId>
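
The assertion above is raised when wait_until_finish() still reports a non-terminal state after the configured timeout. A minimal sketch of that control flow, with result and timeout_ms standing in for self.result and self.timeout_ms from load_test.py:

    from apache_beam.runners.runner import PipelineState

    def finish_or_cancel(result, timeout_ms):
        # Wait up to timeout_ms; a streaming job commonly comes back as
        # RUNNING here once the timeout expires.
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            # Cancel the stuck job instead of waiting indefinitely.
            result.cancel()
        return state
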

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 18s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/6l2jz5zhxl3ve

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1007

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1007/display/redirect>

Changes:


------------------------------------------
[...truncated 32.26 KB...]
  Using cached PyYAML-6.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (596 kB)
Collecting requests_mock<2.0,>=1.7 (from apache-beam==2.49.0.dev0)
  Using cached requests_mock-1.10.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-40.0.2-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.75.6-py3-none-any.whl (413 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.142 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.142-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3080441 sha256=129f11e0868af518505d3bdeacbcb7cf07cfc004ba39e32e100c00d24ac6f12f
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.142 botocore-1.29.142 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.6 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.14 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0527125354.1685200076.202459/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0527125354.1685200076.202459/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0527125354.1685200076.202459/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0527125354.1685200076.202459/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230527150756204970-1893'
 createTime: '2023-05-27T15:07:57.462812Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-27_08_07_56-1160520824984903496'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0527125354'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-27T15:07:57.462812Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-27_08_07_56-1160520824984903496]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-27_08_07_56-1160520824984903496
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-27_08_07_56-1160520824984903496?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-27_08_07_56-1160520824984903496 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:00.941Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.369Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.400Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.454Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.512Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.541Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.598Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.661Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.701Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.727Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.748Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.782Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.806Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.829Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.861Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.895Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.928Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.961Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:02.995Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:03.018Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:03.041Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:03.134Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:03.174Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:03.195Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:03.227Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:03.255Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:03.431Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:03.465Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:03.525Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-27_08_07_56-1160520824984903496 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:14.730Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:08:41.381Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:09:13.863Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:09:24.858Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:31:08.855Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:32:08.631Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T15:59:06.991Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T16:28:18.341Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T16:54:19.116Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T17:13:09.889Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T17:38:10.850Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T17:58:11.723Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T18:18:17.006Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T19:22:14.973Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T19:47:26.213Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-27_08_07_56-1160520824984903496 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T20:01:06.944Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-27_08_07_56-1160520824984903496.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T20:01:06.973Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T20:01:07.015Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T20:01:07.031Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T20:01:07.059Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-27T20:01:07.085Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-27_08_07_56-1160520824984903496?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 9s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/7xsfgtzqaqh7a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Python_Combine_Dataflow_Streaming - Build # 1006 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Python_Combine_Dataflow_Streaming - Build # 1006 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1006/ to view the results.

Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1005

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1005/display/redirect>

Changes:


------------------------------------------
[...truncated 33.73 KB...]
  Using cached azure_core-1.26.4-py3-none-any.whl (173 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.140 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.140-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3080441 sha256=ea1975de3b1dc5bd22b9c8e5ecb9bd276c5b589f6f8b91665c0f113ff9d2364f
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.140 botocore-1.29.140 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.13 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
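Note: the container-image selection and the pre-building hint above map onto ordinary Beam pipeline options. A minimal sketch (the option names are Beam's documented flags, the image tag is copied from the log lines above, and the remaining values are illustrative):

    # Sketch: pin the SDK container image and opt into the pre-building
    # workflow that the INFO message above recommends.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--streaming',
        # Image tag taken from the log lines above.
        '--sdk_container_image=gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422',
        # Pre-build the SDK container once instead of reinstalling
        # dependencies on every worker start.
        '--prebuild_sdk_container_engine=cloud_build',
    ])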
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0525132109.1685027290.914203/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0525132109.1685027290.914203/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0525132109.1685027290.914203/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0525132109.1685027290.914203/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230525150810916561-3200'
 createTime: '2023-05-25T15:08:12.155641Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-25_08_08_11-6055489963498988871'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0525132109'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-25T15:08:12.155641Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-25_08_08_11-6055489963498988871]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-25_08_08_11-6055489963498988871
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-25_08_08_11-6055489963498988871?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-25_08_08_11-6055489963498988871 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:17.281Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:23.397Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.378Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.444Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.505Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.532Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.586Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.629Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.670Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.696Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.727Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.758Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.793Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.824Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.856Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.881Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.904Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.924Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.956Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:25.979Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:26Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
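Note: the step names in the fusion messages above correspond to a Combine load-test pipeline of roughly this shape (a simplified sketch against Beam's public API, not the actual load-test code; beam.Create stands in for the synthetic source and the measurement steps are placeholders):

    import apache_beam as beam
    from apache_beam.transforms import combiners

    # Read synthetic -> Measure time: Start -> Combine with Top 0
    # -> Consume 0 -> Measure time: End 0, as in the fused stages above.
    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(1000))
         | 'Measure time: Start' >> beam.Map(lambda x: x)
         | 'Combine with Top 0' >> beam.CombineGlobally(combiners.TopCombineFn(10))
         | 'Consume 0' >> beam.Map(len)
         | 'Measure time: End 0' >> beam.Map(lambda x: x))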
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:26.098Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:26.125Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:26.155Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:26.186Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:26.210Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:26.369Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:26.405Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:26.454Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-25_08_08_11-6055489963498988871 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:08:28.085Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
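Note: if the 100-descriptor cap above becomes a problem, stale custom descriptors can be pruned through the Cloud Monitoring API. A minimal sketch, assuming the google-cloud-monitoring client and suitable permissions on the project:

    # List custom metric descriptors and (optionally) delete unused ones,
    # per the API links in the log message above.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    descriptors = client.list_metric_descriptors(request={
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        print('candidate for deletion:', descriptor.type)
        # Uncomment to actually delete:
        # client.delete_metric_descriptor(request={'name': descriptor.name})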
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:09:07.961Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:09:43.428Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:09:51.990Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:20:56.013Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T15:51:22.116Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T16:01:23.282Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T16:05:45.124Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T16:26:46.361Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T16:27:28.281Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T16:43:29.324Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T16:59:30.290Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T17:02:41.843Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T17:04:32.828Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T17:18:33.840Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T17:32:35.101Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T17:35:36.181Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T17:41:37.154Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T17:52:38.449Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T18:04:39.426Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T18:09:11.063Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T18:15:42.196Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T18:27:44.014Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T18:34:49.908Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T18:37:45.849Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T18:50:46.682Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T19:01:57.632Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T19:04:48.863Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T19:08:50.186Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T19:24:55.291Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T19:29:52.543Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T19:34:05.734Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T19:36:58.320Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T19:56:00.667Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T19:59:12.737Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T20:00:41.233Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-25_08_08_11-6055489963498988871.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T20:00:41.263Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T20:00:41.337Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T20:00:41.354Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T20:00:41.383Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-25T20:00:41.409Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-25_08_08_11-6055489963498988871 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-25_08_08_11-6055489963498988871?project=<ProjectId>
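Note: this assertion is raised by the load-test harness after wait_until_finish returns without a terminal state; in outline the logic looks like this (a simplified sketch, not the exact Beam source):

    # Wait up to timeout_ms for the job, then require a terminal state.
    from apache_beam.runners.runner import PipelineState

    def wait_for_terminal_state(result, timeout_ms, console_url):
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            # Streaming jobs that outlive the timeout hit this path: the
            # harness cancels the job and reports the run as failed.
            raise AssertionError(
                'Job did not reach a terminal state. Console URL: '
                '{}'.format(console_url))
        return state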

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 12s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/xzhbzkcqqflqu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1004

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1004/display/redirect>

Changes:


------------------------------------------
[...truncated 34.28 KB...]
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.139 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.139-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3077051 sha256=c52b6608fe82b9df0f870e3325155214b5f6814f9f737d0eb3d1ff8aa664cc1a
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.139 botocore-1.29.139 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.13 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0524125348.1684940869.718434/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0524125348.1684940869.718434/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0524125348.1684940869.718434/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0524125348.1684940869.718434/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230524150749720743-1982'
 createTime: '2023-05-24T15:07:51.192766Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-24_08_07_50-17168189588363713913'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0524125348'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-24T15:07:51.192766Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-24_08_07_50-17168189588363713913]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-24_08_07_50-17168189588363713913
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-24_08_07_50-17168189588363713913?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-24_08_07_50-17168189588363713913 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:54.953Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.064Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.099Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.173Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.230Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.259Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.311Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.352Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.390Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.423Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.454Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.485Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.522Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.543Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.576Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.597Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.618Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.654Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.679Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.702Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.735Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.822Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.863Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.887Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.910Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:58.935Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:59.117Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:59.158Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:07:59.207Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-24_08_07_50-17168189588363713913 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:08:34.436Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:08:52.150Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:08:52.179Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
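Note: the quota hint above can be checked against the Compute Engine regional quotas directly. A small sketch, assuming the google-api-python-client package and Application Default Credentials:

    # Print usage vs. limit for quotas that commonly cap Dataflow workers.
    from googleapiclient import discovery

    compute = discovery.build('compute', 'v1')
    region = compute.regions().get(
        project='apache-beam-testing', region='us-central1').execute()
    for quota in region['quotas']:
        if quota['metric'] in ('CPUS', 'IN_USE_ADDRESSES', 'DISKS_TOTAL_GB'):
            print(quota['metric'], quota['usage'], '/', quota['limit'])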
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:09:21.288Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:09:30.794Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:09:42.015Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:12:11.197Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:53:23.988Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:53:24.024Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:54:29.797Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:54:29.814Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T15:57:39.806Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T16:23:57.951Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T16:23:57.978Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T16:24:08.028Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T16:31:39.236Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T16:43:30.772Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T16:54:31.499Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T16:55:32.399Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T16:58:34.918Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T17:19:44.197Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T17:22:37.471Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T17:26:45.457Z: JOB_MESSAGE_ERROR: generic::aborted: SDK harness sdk-0-0 disconnected. This usually means that the process running the pipeline code has crashed. Inspect the Worker Logs and the Diagnostics tab to determine the cause of the crash.
with MessageCode:
(25234d49bec01979): SDK disconnect.
passed through:
==>
    dist_proc/dax/workflow/worker/fnapi_control_service.cc:217
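Note: to triage an SDK harness crash like the one above, the worker logs can be pulled by job id from Cloud Logging. A minimal sketch, assuming the google-cloud-logging client (the job id is copied from this run):

    # Fetch recent ERROR-level entries for the crashed Dataflow job.
    from google.cloud import logging as cloud_logging

    client = cloud_logging.Client(project='apache-beam-testing')
    log_filter = (
        'resource.type="dataflow_step" '
        'AND resource.labels.job_id="2023-05-24_08_07_50-17168189588363713913" '
        'AND severity>=ERROR')
    for entry in client.list_entries(filter_=log_filter, page_size=50):
        print(entry.timestamp, entry.payload)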
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T17:33:49.361Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T17:44:41.536Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T17:53:42.618Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T17:57:44.037Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T18:07:45.352Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T18:18:47.195Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T18:27:51.987Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T18:30:49.170Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T18:44:51.486Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T18:46:03.101Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T19:02:17.746Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T19:04:55.101Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T19:18:06.479Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T19:24:57.746Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T19:30:59.102Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T19:41:01.149Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T19:44:03.077Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-24T19:51:17.807Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-24_08_07_50-17168189588363713913 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-24_08_07_50-17168189588363713913?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 54m 21s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/khsj5c4o7vxkm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1003

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1003/display/redirect>

Changes:


------------------------------------------
[...truncated 32.91 KB...]
Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_spanner-3.34.0-py2.py3-none-any.whl (331 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.9.1-py2.py3-none-any.whl (99 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.1-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting boto3<2,>=1.9 (from apache-beam==2.49.0.dev0)
  Using cached boto3-1.26.138-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.138 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.138-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.55.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3077051 sha256=128b6374c4a47c1b6a540fcd14cf58f9ed0c74b1003b7b6ad9b62986211a7053
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.138 botocore-1.29.138 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.34.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.55.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
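Since every run re-installs the dependency set above into the SDK worker container, the pre-building hint in the log can cut worker startup cost. A hedged sketch of the documented prebuilding pipeline options is below; the registry push URL is a placeholder, not this job's actual registry.

    # Sketch: opting into the SDK container pre-building workflow that the
    # log message above suggests. The push URL is a placeholder.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/<project>/prebuilt',  # placeholder
    ])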
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0523125345.1684854474.141459/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0523125345.1684854474.141459/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0523125345.1684854474.141459/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0523125345.1684854474.141459/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230523150754143976-6763'
 createTime: '2023-05-23T15:07:55.163479Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-23_08_07_54-16406262625876831656'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0523125345'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-23T15:07:55.163479Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-23_08_07_54-16406262625876831656]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-23_08_07_54-16406262625876831656
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-23_08_07_54-16406262625876831656?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-23_08_07_54-16406262625876831656 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:00.253Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:01.783Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:01.820Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:01.897Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:01.966Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.007Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.049Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.117Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.197Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.232Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.263Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.286Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.330Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.364Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.391Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.426Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.450Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.502Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.524Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.550Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.573Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
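The fusing messages above trace the shape of the Combine load-test pipeline: a synthetic read, a timing step, a global Top combine (which Dataflow expands into KeyWithVoid/CombinePerKey/UnKey, exactly the steps being fused), then a consume and a final timing step. Roughly reconstructed from the step names (not the literal combine_test.py code; the source and timing transforms here are stand-ins):

    # Approximate pipeline shape inferred from the fused step names above;
    # the synthetic source and timing steps are stand-ins.
    import apache_beam as beam
    from apache_beam.transforms import combiners

    def build(p, n=10):
        return (
            p
            | 'Read synthetic' >> beam.Create(range(1000))
            | 'Measure time: Start' >> beam.Map(lambda x: x)
            | 'Combine with Top 0' >> combiners.Top.Largest(n)
            | 'Consume 0' >> beam.Map(len)
            | 'Measure time: End 0' >> beam.Map(lambda x: x))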
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.699Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.740Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.773Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.806Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:02.831Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:03.022Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:03.055Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:03.146Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-23_08_07_54-16406262625876831656 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:32.940Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:42.272Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:42.294Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:08:52.161Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:09:17.411Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:09:29.085Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:13:52.648Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
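The worker counts in these messages (5 requested, briefly resized to 4 on a likely quota limit) and the throughput-based autoscaling lines that follow are driven by the job's worker-pool options. For reference, a hedged sketch of options that would produce this configuration (5 e2-standard-2 streaming workers in us-central1):

    # Sketch of the worker-pool options reflected in the messages above;
    # an approximation of the load test's settings, not its exact flags.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--region=us-central1',
        '--num_workers=5',
        '--machine_type=e2-standard-2',
        '--autoscaling_algorithm=THROUGHPUT_BASED',
        '--streaming',
    ])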
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:53:56.028Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:55:37.884Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T15:58:49.505Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T16:20:10.680Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T16:28:02.468Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T16:34:07.235Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T16:57:04.380Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T17:03:09.737Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T17:35:37.288Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T17:53:49.542Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T18:12:14.220Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T18:20:11.304Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T18:38:12.363Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T19:14:17.392Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T19:50:15.392Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T20:16:31.769Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T20:43:29.892Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T20:47:25.514Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-23_08_07_54-16406262625876831656.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T20:47:25.547Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T20:47:25.612Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T20:47:25.629Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T20:47:25.652Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-23T20:47:25.676Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-23_08_07_54-16406262625876831656 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-23_08_07_54-16406262625876831656?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5h 42m
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/b2mimcuzu5lqy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1002

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1002/display/redirect>

Changes:


------------------------------------------
[...truncated 32.96 KB...]
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_spanner-3.34.0-py2.py3-none-any.whl (331 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.9.1-py2.py3-none-any.whl (99 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.1-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.137 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.137-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3080833 sha256=d53df17018513840c5049aa33d391095469ad859994cb45f0c7aedf6d28f028a
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.137 botocore-1.29.137 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.34.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0522142447.1684768080.442690/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0522142447.1684768080.442690/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0522142447.1684768080.442690/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0522142447.1684768080.442690/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230522150800443742-5712'
 createTime: '2023-05-22T15:08:01.587670Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-22_08_08_01-3934459487637599205'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0522142447'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-22T15:08:01.587670Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-22_08_08_01-3934459487637599205]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-22_08_08_01-3934459487637599205
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-22_08_08_01-3934459487637599205?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-22_08_08_01-3934459487637599205 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:06.747Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.358Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.377Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.446Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.501Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.530Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.596Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.677Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.730Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.759Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.792Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.825Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.858Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.893Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.924Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.957Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:10.991Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.026Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.058Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.092Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.128Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.211Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.250Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.273Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.303Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.335Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.506Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.540Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:11.575Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-22_08_08_01-3934459487637599205 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:40.543Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:08:53.774Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:09:30.930Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:09:39.714Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:14:42.773Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:45:10.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T15:46:14.020Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T16:10:17.755Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T16:11:14.896Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T16:13:16.402Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T16:37:21.053Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T16:45:21.809Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T17:02:19.864Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T17:12:22.020Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T17:27:23.194Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T17:39:24.361Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T18:02:25.437Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T18:06:27.476Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T18:27:29.089Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T18:35:29.914Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T19:01:31.237Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T19:03:43.550Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T19:25:34.728Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T19:53:36.160Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T19:54:47.244Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-22_08_08_01-3934459487637599205 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T20:00:50.738Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-22_08_08_01-3934459487637599205.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T20:00:50.767Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T20:00:50.881Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T20:00:50.903Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T20:00:50.932Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-22T20:00:50.957Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-22_08_08_01-3934459487637599205?project=<ProjectId>
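[Editor's note: the traceback above is the harness waiting on wait_until_finish with a bounded duration and asserting that the job reached a terminal state. A minimal sketch of that pattern, with an explicit cancel for a streaming job that never terminates on its own; the options and the timeout value are illustrative assumptions, not the harness's actual configuration:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    options = PipelineOptions(['--runner=DataflowRunner'])  # placeholder options
    result = beam.Pipeline(options=options).run()
    # duration is in milliseconds; the harness passes its timeout_ms here.
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)

    if not PipelineState.is_terminal(state):
        # A streaming job keeps running unless cancelled explicitly; raising
        # on the non-terminal state instead is what produced the error above.
        result.cancel()
]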

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 13s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/nnaheuypio4zc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1001

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1001/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #26807 from liferoad/autoschema-known-issues


------------------------------------------
[...truncated 32.84 KB...]
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-40.0.2-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.75.3-py3-none-any.whl (413 kB)
Collecting boto3<2,>=1.9 (from apache-beam==2.49.0.dev0)
  Using cached boto3-1.26.137-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.137 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.137-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3073378 sha256=6fada129c57385237f64244949b6aa0e62f8c5376768afca4a26f5dd25e01822
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.137 botocore-1.29.137 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.34.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0521125345.1684681678.230378/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0521125345.1684681678.230378/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0521125345.1684681678.230378/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0521125345.1684681678.230378/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230521150758231442-6897'
 createTime: '2023-05-21T15:07:59.331777Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-21_08_07_58-12641507434024627018'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0521125345'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-21T15:07:59.331777Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
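[Editor's note: the Job resource above describes a streaming submission to us-central1 under the apache-beam-testing project. A rough reconstruction of the pipeline options that produce such a job — the values are read off the fields above and the log lines nearby, but the exact flags the harness uses may differ:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--job_name=load-tests-python-dataflow-streaming-combine-1-0521125345',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--streaming',       # matches JOB_TYPE_STREAMING above
        '--num_workers=5',   # matches "Starting 5 workers" below
    ])
]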
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-21_08_07_58-12641507434024627018]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-21_08_07_58-12641507434024627018
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-21_08_07_58-12641507434024627018?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-21_08_07_58-12641507434024627018 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:04.485Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.473Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.499Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.554Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.638Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.657Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.705Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.755Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.788Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.825Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.852Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.880Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.902Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.923Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.954Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:05.980Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.013Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.045Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.077Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.101Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.123Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
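[Editor's note: the fused step names above outline the shape of the test pipeline: a synthetic read, a timing step, a CombineGlobally over a Top combiner (the KeyWithVoid/UnKey pair is how CombineGlobally expands internally), then a consumer and a closing timing step. A structural sketch under those assumptions — it is not the actual combine_test.py, and the Create/Map stand-ins replace the load-test utilities:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.transforms import combiners

    options = PipelineOptions(['--runner=DirectRunner'])  # placeholder options
    with beam.Pipeline(options=options) as p:
        _ = (
            p
            | 'Read synthetic' >> beam.Create(range(1000))    # stand-in for the synthetic source
            | 'Measure time: Start' >> beam.Map(lambda x: x)  # stand-in for the timing DoFn
            | 'Combine with Top 0' >> beam.CombineGlobally(
                combiners.TopCombineFn(20)).without_defaults()
            | 'Consume 0' >> beam.FlatMap(lambda top: top)
            | 'Measure time: End 0' >> beam.Map(lambda x: x)  # stand-in for the timing DoFn
        )
]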
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.228Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.292Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.319Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.346Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.370Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.560Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.590Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:06.624Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-21_08_07_58-12641507434024627018 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:22.259Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:08:45.847Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:09:14.196Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:09:23.289Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T15:31:03.314Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T16:01:02.164Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T16:02:07.046Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T16:26:07.925Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T16:27:04.906Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T16:29:09.503Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T17:00:06.740Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T17:01:11.386Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T17:35:09.526Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T17:37:14.508Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T17:39:15.613Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T18:10:23.460Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T18:11:15.305Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T18:37:16.521Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T18:38:27.766Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T18:39:29.748Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T19:03:21.120Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T19:07:21.769Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T19:37:23.306Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T19:47:23.833Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-21_08_07_58-12641507434024627018 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T20:01:00.742Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-21_08_07_58-12641507434024627018.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T20:01:00.779Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T20:01:00.833Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T20:01:00.847Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T20:01:00.871Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-21T20:01:00.891Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-21_08_07_58-12641507434024627018?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 9s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/eek34xswvwec2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1000

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1000/display/redirect>

Changes:


------------------------------------------
[...truncated 33.89 KB...]
  Using cached hypothesis-6.75.3-py3-none-any.whl (413 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.137 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.137-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3073378 sha256=082948d5396b3f493156a71a8f15e712edeba921ff5fd2129a0988eff2904ee3
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.137 botocore-1.29.137 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.34.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0520125345.1684595275.057942/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0520125345.1684595275.057942/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0520125345.1684595275.057942/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0520125345.1684595275.057942/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230520150755059003-5056'
 createTime: '2023-05-20T15:07:56.263563Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-20_08_07_55-11668278993561578936'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0520125345'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-20T15:07:56.263563Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-20_08_07_55-11668278993561578936]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-20_08_07_55-11668278993561578936
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-20_08_07_55-11668278993561578936?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-20_08_07_55-11668278993561578936 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:01.439Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:03.985Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.010Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.067Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.138Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.168Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.234Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.294Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.323Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.353Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.376Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.428Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.464Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.509Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.540Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.576Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.610Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.642Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.668Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.710Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.748Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.845Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.886Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.918Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.956Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:04.986Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:05.209Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:05.256Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:05.286Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-20_08_07_55-11668278993561578936 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:23.853Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
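
The metric-descriptor warning above is actionable: once a project hits the quota of 100 custom.googleapis.com/* descriptors, stale ones can be listed and deleted through the Cloud Monitoring API. A hedged sketch with the google-cloud-monitoring client; the project id is a placeholder, and the prefix check is an assumption about which descriptors are the Dataflow-created ones:

    from google.cloud import monitoring_v3

    PROJECT_ID = 'my-project'  # placeholder, not taken from this build
    client = monitoring_v3.MetricServiceClient()

    for descriptor in client.list_metric_descriptors(
            name='projects/' + PROJECT_ID):
        # User-defined descriptors live under custom.googleapis.com/*.
        if descriptor.type.startswith('custom.googleapis.com/'):
            print('deleting', descriptor.name)
            client.delete_metric_descriptor(name=descriptor.name)
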
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:08:46.785Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:09:29.690Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:09:38.375Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:15:33.078Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:48:21.022Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T15:56:01.660Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T16:00:03.145Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T16:20:08.627Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T16:32:09.475Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T16:35:56.515Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T16:56:11.372Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T17:00:13.088Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T17:14:10.084Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T17:18:11.594Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T17:27:13.018Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T17:38:13.954Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T17:47:15.600Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T17:51:16.753Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T18:03:17.776Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T18:04:19.497Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T18:13:22Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T18:18:23.476Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T18:32:24.551Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T18:36:25.632Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T18:42:26.490Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T18:52:27.787Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T18:59:28.899Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T19:02:29.808Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T19:08:31.296Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T19:19:32.796Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T19:25:33.852Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T19:27:35.646Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T19:36:46.563Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T19:47:38.892Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T19:56:41.360Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T19:57:42.958Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T20:00:36.894Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-20_08_07_55-11668278993561578936.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T20:00:36.925Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T20:00:36.985Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T20:00:37.008Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T20:00:37.041Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-20T20:00:37.069Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-20_08_07_55-11668278993561578936 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-20_08_07_55-11668278993561578936?project=<ProjectId>
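
The assertion above fires because wait_until_finish(duration=...) returned while the job was still in JOB_STATE_CANCELLING: cancellation was requested, but no terminal state arrived within the wait. A minimal sketch of the bounded-wait-then-cancel pattern involved, with a toy pipeline and a placeholder timeout rather than this job's actual values:

    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    timeout_ms = 4 * 60 * 60 * 1000     # placeholder budget, not this job's setting
    p = beam.Pipeline()                 # stands in for the load-test pipeline
    _ = p | beam.Create([1, 2, 3])

    result = p.run()
    state = result.wait_until_finish(duration=timeout_ms)
    if not PipelineState.is_terminal(state):
        result.cancel()
        # Give the cancel request time to commit before asserting.
        state = result.wait_until_finish(duration=5 * 60 * 1000)
    assert PipelineState.is_terminal(state), 'non-terminal state: %s' % state
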

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 13s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/vveq6qjq4zkzu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #999

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/999/display/redirect>

Changes:


------------------------------------------
[...truncated 32.86 KB...]
  Using cached google_cloud_language-2.9.1-py2.py3-none-any.whl (99 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.1-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.49.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_core-1.26.4-py3-none-any.whl (173 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Collecting boto3<2,>=1.9 (from apache-beam==2.49.0.dev0)
  Using cached boto3-1.26.136-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.136 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.136-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3073378 sha256=3915476b583ffdc05a54ed04ad84abf8800427c2450427220475c53ad6cf3301
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.136 botocore-1.29.136 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.34.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
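
The pre-building hint in the message above maps to Beam's Python setup options; assuming the flags described in the linked Dataflow guide, an opt-in looks roughly like this (the registry URL is a placeholder):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Sketch: bake the extra dependencies into the SDK container once,
    # instead of installing them on every worker at startup.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/my-project/prebuilt',  # placeholder
    ])
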
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0519125355.1684508897.194481/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0519125355.1684508897.194481/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0519125355.1684508897.194481/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0519125355.1684508897.194481/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230519150817195524-8120'
 createTime: '2023-05-19T15:08:18.326956Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-19_08_08_17-9101277076475977609'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0519125355'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-19T15:08:18.326956Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
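
For reference, the Job fields above map directly onto standard pipeline options; a sketch of the options this submission implies, with values copied from the log and everything else left at its default:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--job_name=load-tests-python-dataflow-streaming-combine-1-0519125355',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--streaming',  # type: JOB_TYPE_STREAMING
    ])
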
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-19_08_08_17-9101277076475977609]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-19_08_08_17-9101277076475977609
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-19_08_08_17-9101277076475977609?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-19_08_08_17-9101277076475977609 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:23.951Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:24.993Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.024Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.077Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.145Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.165Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.214Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.258Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.288Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.320Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.350Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.414Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.441Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.463Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.485Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.515Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.548Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.579Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.609Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.641Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.666Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.759Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.791Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.819Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.842Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:25.875Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:26.037Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:26.065Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:26.097Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-19_08_08_17-9101277076475977609 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:08:41.469Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:09:09.521Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:09:37.651Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:09:47.867Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:10:25.560Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:43:24.895Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:44:26.386Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T15:45:27.047Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T16:10:28.576Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T16:14:29.944Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T16:45:31.369Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T17:03:32.619Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T17:16:34.158Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T17:32:35.156Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T17:34:37.007Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T18:05:38.410Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T18:13:39.098Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T18:32:50.577Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T18:43:41.502Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T18:59:42.427Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T19:08:43.323Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T19:28:44.422Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T19:55:45.809Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-19_08_08_17-9101277076475977609 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T20:00:44.901Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-19_08_08_17-9101277076475977609.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T20:00:44.978Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T20:00:45.036Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T20:00:45.060Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T20:00:45.089Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-19T20:00:45.114Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-19_08_08_17-9101277076475977609?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 13s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/llabidqsjdyei

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #998

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/998/display/redirect>

Changes:


------------------------------------------
[...truncated 33.57 KB...]
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting boto3<2,>=1.9 (from apache-beam==2.49.0.dev0)
  Using cached boto3-1.26.135-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.135 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.135-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3069404 sha256=59e680b726f355a02c4e97c54526001329bef6ea69acbd4812af85052311e9b3
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.135 botocore-1.29.135 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.34.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.0 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0518125343.1684422490.877896/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0518125343.1684422490.877896/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0518125343.1684422490.877896/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0518125343.1684422490.877896/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230518150810878896-6824'
 createTime: '2023-05-18T15:08:11.952704Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-18_08_08_11-11448134124049629824'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0518125343'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-18T15:08:11.952704Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-18_08_08_11-11448134124049629824]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-18_08_08_11-11448134124049629824
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-18_08_08_11-11448134124049629824?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-18_08_08_11-11448134124049629824 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:18.203Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.445Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.494Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.561Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.645Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.680Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.746Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.809Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.849Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.886Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.920Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.952Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:19.984Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.017Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.050Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.084Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.106Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.127Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.154Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.188Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.222Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
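The fusion messages above trace the shape of the Combine load test: a synthetic read feeding a global Top combine between two timing steps. A rough, self-contained sketch of that shape (the stand-in source, DoFn, and sizes are illustrative, not the actual combine_test.py code):

    import apache_beam as beam
    from apache_beam.transforms import combiners

    class MeasureTime(beam.DoFn):
        # Stand-in for the load test's timing DoFn; here it only forwards elements.
        def process(self, element):
            yield element

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(1000))  # stand-in for the synthetic SDF source
         | 'Measure time: Start' >> beam.ParDo(MeasureTime())
         # CombineGlobally expands to KeyWithVoid + CombinePerKey + UnKey,
         # matching the fused stage names in the log above.
         | 'Combine with Top 0' >> beam.CombineGlobally(combiners.TopCombineFn(10)).without_defaults()
         | 'Consume 0' >> beam.Map(len)
         | 'Measure time: End 0' >> beam.ParDo(MeasureTime()))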
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.317Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.344Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.374Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.406Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.428Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.515Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.533Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:20.583Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-18_08_08_11-11448134124049629824 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:08:44.487Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
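The warning above points at the Monitoring metric-descriptor quota; unused custom descriptors can be listed (and, after review, deleted) through the Monitoring v3 API. A hedged sketch using the google-cloud-monitoring client, with the project name as a placeholder:

    # Hedged sketch: enumerate custom metric descriptors; deletion is
    # irreversible, so the delete call is left commented out.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        "name": "projects/my-project",  # placeholder project
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)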
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:09:12.047Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:09:43.399Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:09:55.698Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:13:39.718Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:52:39.920Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:53:44.520Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:55:41.686Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T15:57:46.764Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T16:21:54.978Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T16:23:46.942Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T16:25:51.924Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T16:47:50.401Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T16:49:51.346Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T16:58:52.765Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T17:14:54.069Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T17:19:55.742Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T17:27:57.867Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T17:41:59.247Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T17:49:01.552Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T17:58:02.509Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T18:09:03.978Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T18:17:15.948Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T18:27:06.881Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T18:37:08.003Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T18:43:09.302Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T18:55:11.226Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T19:05:22.102Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T19:12:14.407Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T19:24:15.990Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T19:33:16.808Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T19:41:17.871Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T19:52:19.001Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-18_08_08_11-11448134124049629824 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T20:01:19.758Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-18_08_08_11-11448134124049629824.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T20:01:19.841Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T20:01:19.888Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T20:01:19.910Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T20:01:19.936Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-18T20:01:19.955Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-18_08_08_11-11448134124049629824?project=<ProjectId>
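The assertion above comes from guarding wait_until_finish with a timeout: the call returns the job's state after the given duration in milliseconds, and a non-terminal state fails the test. A minimal sketch of that guard pattern, cancelling instead of asserting (a standalone illustration, not the load-test harness itself):

    # Hedged sketch: wait up to timeout_ms, then cancel a still-running job.
    from apache_beam.runners.runner import PipelineState

    def finish_or_cancel(result, timeout_ms):
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            result.cancel()                     # request job cancellation
            state = result.wait_until_finish()  # block until it is terminal
        return state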

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 19s
15 actionable tasks: 15 executed

Publishing build scan...
https://ge.apache.org/s/t24r7bguqplgy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #997

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/997/display/redirect>

Changes:


------------------------------------------
[...truncated 33.31 KB...]
Collecting boto3<2,>=1.9 (from apache-beam==2.48.0.dev0)
  Using cached boto3-1.26.135-py3-none-any.whl (135 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.48.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.48.0.dev0)
  Using cached azure_core-1.26.4-py3-none-any.whl (173 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.48.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.135 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.135-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3069405 sha256=cce0d5bfbf3a1208cbd37e83fdb049ce0875d3c40ef3792880fa45a00408b378
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, protobuf, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, proto-plus, pandas, httplib2, googleapis-common-protos, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpcio-status, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
  Attempting uninstall: protobuf
    Found existing installation: protobuf 4.23.0
    Uninstalling protobuf-4.23.0:
      Successfully uninstalled protobuf-4.23.0
Successfully installed PyJWT-2.7.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.135 botocore-1.29.135 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.34.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 protobuf-4.22.5 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.0 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0517125344.1684336080.903929/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0517125344.1684336080.903929/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0517125344.1684336080.903929/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0517125344.1684336080.903929/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230517150800904900-7495'
 createTime: '2023-05-17T15:08:01.932196Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-17_08_08_01-11090810857374413370'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0517125344'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-17T15:08:01.932196Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-17_08_08_01-11090810857374413370]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-17_08_08_01-11090810857374413370
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-17_08_08_01-11090810857374413370?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-17_08_08_01-11090810857374413370 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:06.697Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.279Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.298Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.355Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.410Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.436Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.492Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.553Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.582Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.606Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.629Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.655Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.670Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.695Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.725Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.746Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.770Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.792Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.814Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.835Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.932Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.952Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.973Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:09.996Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:10.023Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:10.177Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:10.192Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:10.214Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-17_08_08_01-11090810857374413370 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:19.470Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:50.699Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:08:50.718Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
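The resize message above suggests a Compute Engine quota ceiling; regional quotas can be inspected programmatically. A hedged sketch with google-api-python-client (field names per the Compute v1 regions.get response; the project is a placeholder):

    # Hedged sketch: flag regional Compute Engine quotas near their limit,
    # which can explain worker pools resizing below the autoscaling goal.
    from googleapiclient.discovery import build

    compute = build("compute", "v1")
    region = compute.regions().get(project="my-project",
                                   region="us-central1").execute()
    for quota in region.get("quotas", []):
        if quota["limit"] and quota["usage"] / quota["limit"] > 0.9:
            print(quota["metric"], quota["usage"], "/", quota["limit"])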
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:09:00.490Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:09:26.496Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:09:36.195Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T15:33:54.271Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T16:06:14.898Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T16:07:15.073Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T16:08:16.123Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T16:34:27.189Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T16:43:18.339Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T17:01:22.940Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T17:12:20.018Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T17:28:20.912Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T17:41:21.884Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T17:56:22.617Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T18:12:24.864Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T18:25:26.034Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T18:38:36.744Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T18:53:28.416Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T19:04:29.583Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T19:20:30.492Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T19:32:41.953Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T19:48:32.939Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T20:00:33.916Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-17_08_08_01-11090810857374413370 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T20:00:57.205Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-17_08_08_01-11090810857374413370.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T20:00:57.227Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T20:00:57.280Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T20:00:57.303Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T20:00:57.325Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-17T20:00:57.348Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-17_08_08_01-11090810857374413370?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 29s
15 actionable tasks: 15 executed

Publishing build scan...
https://ge.apache.org/s/y72yzf6tskrp6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #996

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/996/display/redirect>

Changes:


------------------------------------------
[...truncated 34.68 KB...]
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.134 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.134-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable!=2.18.0,<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3069099 sha256=ca2b08a3ec91bd7ce74b178dfe514b41644dc6b701fe0c19c19012f3a4443566
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, protobuf, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, proto-plus, pandas, httplib2, googleapis-common-protos, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpcio-status, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
  Attempting uninstall: protobuf
    Found existing installation: protobuf 4.23.0
    Uninstalling protobuf-4.23.0:
      Successfully uninstalled protobuf-4.23.0
Successfully installed PyJWT-2.7.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.134 botocore-1.29.134 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.18.1 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.34.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 protobuf-4.22.5 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.0 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
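The pre-building workflow mentioned above is driven by Beam pipeline options; a minimal sketch of the relevant flags, assuming a project with Cloud Build enabled and a registry you can push to (the registry path here is hypothetical):

    --prebuild_sdk_container_engine=cloud_build \
    --docker_registry_push_url=gcr.io/<ProjectId>/prebuilt_beam_sdk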
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0516130024.1684249750.239727/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0516130024.1684249750.239727/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0516130024.1684249750.239727/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0516130024.1684249750.239727/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230516150910241508-2933'
 createTime: '2023-05-16T15:09:11.540410Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-16_08_09_11-13718298765911795807'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0516130024'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-16T15:09:11.540410Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-16_08_09_11-13718298765911795807]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-16_08_09_11-13718298765911795807
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-16_08_09_11-13718298765911795807?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-16_08_09_11-13718298765911795807 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:16.399Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:17.738Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:17.769Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:17.833Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:17.895Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:17.923Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:17.990Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.046Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.090Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.126Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.150Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.180Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.236Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.257Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.279Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.311Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.345Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.368Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.400Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.432Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.523Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.566Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.597Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.620Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.649Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.829Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.855Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:18.889Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-16_08_09_11-13718298765911795807 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:09:52.233Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
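The metric-descriptor cleanup suggested above can also be scripted against the Cloud Monitoring API; a minimal sketch using the google-cloud-monitoring client (the filter, and the choice to delete everything it matches, are assumptions, not part of this build):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        "name": "projects/apache-beam-testing",
        # Match only the Dataflow-created custom metrics named in the message above.
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        client.delete_metric_descriptor(name=descriptor.name)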
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:10:10.079Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:10:37.926Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:10:47.557Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:17:38.972Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:46:38.062Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:50:39.070Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T15:57:40.485Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T16:15:41.417Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T16:24:42.138Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T16:30:43.800Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T16:39:45.411Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T16:51:56.909Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T16:52:48.432Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T16:58:49.830Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T17:08:50.884Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T17:20:52.005Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T17:21:53.602Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T17:25:55.360Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T17:35:56.369Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T17:44:58.520Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T17:50:59.513Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T17:54:00.522Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T18:10:02.664Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T18:11:05.287Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T18:19:06.548Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T18:23:07.686Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T18:37:09.203Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T18:38:10.897Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T18:47:11.855Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T18:51:12.972Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T19:03:14.241Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T19:04:15.475Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T19:15:17.103Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T19:18:18.311Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T19:30:20.626Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T19:38:21.671Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T19:42:22.561Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T19:45:23.626Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T19:54:25.392Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-16_08_09_11-13718298765911795807 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T20:01:07.663Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-16_08_09_11-13718298765911795807.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T20:01:07.711Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T20:01:07.758Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T20:01:07.778Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T20:01:07.809Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-16T20:01:07.826Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-16_08_09_11-13718298765911795807?project=<ProjectId>
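For context, this assertion is raised after wait_until_finish() gives up on the streaming job; a minimal sketch of the guard pattern involved, assuming a DataflowPipelineResult named result and a timeout_ms value (names are illustrative, not the harness's exact code):

    from apache_beam.runners.runner import PipelineState

    # For streaming jobs, wait_until_finish(duration=...) can return while the
    # job is still running; check for a terminal state before declaring success.
    state = result.wait_until_finish(duration=timeout_ms)
    if not PipelineState.is_terminal(result.state):
        result.cancel()  # matches the JOB_STATE_CANCELLING messages seen above
        raise AssertionError('Job did not reach a terminal state')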

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org
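A rerun with the suggested flags, from the workspace root, might look like this (hypothetical invocation):

    ./gradlew :sdks:python:apache_beam:testing:load_tests:run --info --stacktrace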

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 25s
15 actionable tasks: 15 executed

Publishing build scan...
https://ge.apache.org/s/viawnbp2u3j6i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #995

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/995/display/redirect?page=changes>

Changes:

[noreply] [Tour of Beam] Learning content for "Schema-based Transforms" module


------------------------------------------
[...truncated 32.87 KB...]
Collecting scikit-learn>=0.20.0 (from apache-beam==2.48.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.48.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.48.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.48.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.48.0.dev0)
  Using cached cryptography-40.0.2-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.48.0.dev0)
  Using cached hypothesis-6.75.3-py3-none-any.whl (413 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.133 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.133-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable!=2.18.0,<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3068363 sha256=bde4f5e6ec50edc13a741131d82e41e14fabb1ba7d3721c67ffeb8396ea35cc1
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, protobuf, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, proto-plus, pandas, httplib2, googleapis-common-protos, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpcio-status, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
  Attempting uninstall: protobuf
    Found existing installation: protobuf 4.23.0
    Uninstalling protobuf-4.23.0:
      Successfully uninstalled protobuf-4.23.0
Successfully installed PyJWT-2.7.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.133 botocore-1.29.133 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 protobuf-4.22.5 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.0 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0515125408.1684163363.298320/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0515125408.1684163363.298320/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0515125408.1684163363.298320/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0515125408.1684163363.298320/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230515150923299823-8982'
 createTime: '2023-05-15T15:09:24.696899Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-15_08_09_24-4431721837781006786'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0515125408'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-15T15:09:24.696899Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-15_08_09_24-4431721837781006786]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-15_08_09_24-4431721837781006786
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-15_08_09_24-4431721837781006786?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-15_08_09_24-4431721837781006786 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:28.557Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:29.710Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:29.734Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:29.800Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:29.858Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:29.880Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:29.966Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.017Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.058Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.073Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.091Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.113Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.143Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.172Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.196Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.227Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.250Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.274Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.296Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.324Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.346Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.430Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.465Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.486Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.507Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.536Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.716Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.731Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:09:30.775Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-15_08_09_24-4431721837781006786 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:10:03.571Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:10:09.230Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:10:09.249Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:10:19.041Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:10:36.352Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:10:46.516Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T15:37:56.959Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T16:10:23.766Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T16:13:24.752Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T16:37:15.916Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T16:43:36.277Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T17:04:29.101Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T17:13:29.956Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T17:43:31.300Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T17:56:51.471Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T18:12:32.714Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T18:25:34.212Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T18:47:35.306Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T18:55:37.668Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T19:26:39.227Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T19:38:41.714Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T19:54:43.212Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-15_08_09_24-4431721837781006786 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T20:01:19.106Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-15_08_09_24-4431721837781006786.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T20:01:19.138Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T20:01:19.190Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T20:01:19.207Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T20:01:19.236Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-15T20:01:19.263Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-15_08_09_24-4431721837781006786?project=<ProjectId>
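Where a stuck job like this one has to be cancelled by hand, the gcloud CLI issues the same cancel request the runner eventually logged above (job ID, region, and project taken from this log):

    gcloud dataflow jobs cancel 2023-05-15_08_09_24-4431721837781006786 \
        --region=us-central1 --project=apache-beam-testing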

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 11s
15 actionable tasks: 15 executed

Publishing build scan...
https://ge.apache.org/s/3jvtmno4q3lbw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #994

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/994/display/redirect>

Changes:


------------------------------------------
[...truncated 32.91 KB...]
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.48.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.48.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.48.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.48.0.dev0)
  Using cached cryptography-40.0.2-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.48.0.dev0)
  Using cached hypothesis-6.75.3-py3-none-any.whl (413 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.133 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.133-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable!=2.18.0,<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3068363 sha256=9f95d40cb1044f0cd763b268acd879402288ab832e15593721ecc0d50eed0e1f
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, protobuf, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, proto-plus, pandas, httplib2, googleapis-common-protos, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpcio-status, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
  Attempting uninstall: protobuf
    Found existing installation: protobuf 4.23.0
    Uninstalling protobuf-4.23.0:
      Successfully uninstalled protobuf-4.23.0
Successfully installed PyJWT-2.7.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.133 botocore-1.29.133 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.3 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 protobuf-4.22.5 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.0 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
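
The hint above refers to Beam's SDK container image pre-building workflow, which bakes the extra dependencies into the worker image once instead of installing them on every worker at startup. A minimal sketch, assuming the --prebuild_sdk_container_engine and --docker_registry_push_url options available in recent Beam releases (the registry path is a placeholder), of opting in via pipeline options:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        # Build the custom SDK container once, up front, with Cloud Build.
        '--prebuild_sdk_container_engine=cloud_build',
        # Placeholder registry to push the prebuilt image to.
        '--docker_registry_push_url=gcr.io/<ProjectId>/beam-prebuilt',
    ])
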
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0514125344.1684076885.218582/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0514125344.1684076885.218582/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0514125344.1684076885.218582/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0514125344.1684076885.218582/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230514150805219611-7186'
 createTime: '2023-05-14T15:08:06.311367Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-14_08_08_05-13712922427807446315'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0514125344'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-14T15:08:06.311367Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-14_08_08_05-13712922427807446315]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-14_08_08_05-13712922427807446315
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-14_08_08_05-13712922427807446315?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-14_08_08_05-13712922427807446315 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:10.124Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.300Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.352Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.406Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.452Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.469Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.524Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.570Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.610Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.643Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.672Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.706Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.736Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.768Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.793Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.815Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.836Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.901Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.932Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.965Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:11.997Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
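
The fusion messages above trace the shape of the Combine load test: a synthetic source is read, a timing step is applied, the elements are combined globally with a Top combiner (KeyWithVoid/CombinePerKey/UnKey is the standard expansion of a global combine), and the result is consumed. A minimal sketch of that shape, with beam.Create standing in for the SyntheticSource read and the timing DoFns elided:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(1000))
         | 'Combine with Top 0' >> beam.CombineGlobally(
             beam.combiners.TopCombineFn(10))
         | 'Consume 0' >> beam.Map(len))  # output is a single list of the 10 largest elements
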
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:12.081Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:12.118Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:12.140Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:12.175Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:12.206Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:12.384Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:12.413Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:12.435Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-14_08_08_05-13712922427807446315 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:26.948Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
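
The warning above says the project already holds 100 Dataflow-created metric descriptors, so new custom.googleapis.com/* metrics will not be created. A hedged sketch, using the google-cloud-monitoring client library, of listing the custom descriptors so stale ones can be deleted (the project id is a placeholder):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/<ProjectId>'  # placeholder project id
    for descriptor in client.list_metric_descriptors(name=project_name):
        if descriptor.type.startswith('custom.googleapis.com/'):
            # Candidates for client.delete_metric_descriptor(name=descriptor.name)
            print(descriptor.name)
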
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:08:50.995Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:09:22.181Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:09:31.798Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T15:33:22.038Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T16:06:03.999Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T16:07:05.044Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T16:34:06.303Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T16:35:06.786Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T16:42:07.635Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T17:02:08.740Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T17:03:10.038Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T17:32:11.213Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T17:33:12.030Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T18:01:13.161Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T18:03:13.597Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T18:29:14.622Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T18:33:15.457Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T18:56:16.434Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T19:05:17.282Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T19:24:18.371Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T19:43:19.057Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T19:53:20.052Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-14_08_08_05-13712922427807446315 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T20:01:04.668Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-14_08_08_05-13712922427807446315.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T20:01:04.697Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T20:01:04.740Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T20:01:04.758Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T20:01:04.779Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-14T20:01:04.802Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-14_08_08_05-13712922427807446315?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 28s
15 actionable tasks: 15 executed

Publishing build scan...
https://ge.apache.org/s/krt6z3sv2pooc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #993

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/993/display/redirect>

Changes:


------------------------------------------
[...truncated 32.71 KB...]
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable!=2.18.0,<3,>=2.0.0 (from apache-beam==2.48.0.dev0)
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam==2.48.0.dev0)
  Using cached google_cloud_spanner-3.33.0-py2.py3-none-any.whl (328 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.48.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.48.0.dev0)
  Using cached google_cloud_language-2.9.1-py2.py3-none-any.whl (99 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.48.0.dev0)
  Using cached google_cloud_videointelligence-2.11.1-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.48.0.dev0)
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.48.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.133 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.133-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable!=2.18.0,<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3068363 sha256=724456daa5c9564e358c4a69e818a6fc6249e379fe8fc4346c2f9ee5fd90900e
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, protobuf, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, proto-plus, pandas, httplib2, googleapis-common-protos, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpcio-status, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
  Attempting uninstall: protobuf
    Found existing installation: protobuf 4.23.0
    Uninstalling protobuf-4.23.0:
      Successfully uninstalled protobuf-4.23.0
Successfully installed PyJWT-2.7.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.133 botocore-1.29.133 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 protobuf-4.22.5 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.0 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0513125345.1683990481.905113/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0513125345.1683990481.905113/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0513125345.1683990481.905113/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0513125345.1683990481.905113/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230513150801906114-2018'
 createTime: '2023-05-13T15:08:03.109970Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-13_08_08_02-11008052639844548032'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0513125345'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-13T15:08:03.109970Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-13_08_08_02-11008052639844548032]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-13_08_08_02-11008052639844548032
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-13_08_08_02-11008052639844548032?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-13_08_08_02-11008052639844548032 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:06.487Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:07.944Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.063Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.122Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.163Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.213Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.286Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.334Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.359Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.396Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.428Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.463Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.497Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.521Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.543Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.569Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.605Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.641Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.670Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.693Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.789Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.824Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.857Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.890Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:08.927Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:09.157Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:09.185Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:09.231Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-13_08_08_02-11008052639844548032 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:39.099Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:08:49.158Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:09:21.372Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:09:33.462Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T15:32:28.950Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T16:12:02.736Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T16:13:13.586Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T16:14:05.452Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T16:49:07.192Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T17:18:08.394Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T17:28:07.839Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T17:51:09.002Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T18:06:09.964Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T18:25:11.011Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T18:43:12.003Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T19:02:12.890Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T19:19:23.804Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T19:39:14.741Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T19:55:15.450Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-13_08_08_02-11008052639844548032 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T20:01:10.424Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-13_08_08_02-11008052639844548032.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T20:01:10.460Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T20:01:10.531Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T20:01:10.562Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T20:01:10.597Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-13T20:01:10.620Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-13_08_08_02-11008052639844548032?project=<ProjectId>
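
For reference: the assertion above is raised inside wait_until_finish when the cancelled job never reaches a terminal state. A minimal sketch of guarding that call with an explicit cancel, assuming only the public apache_beam PipelineResult API (the timeout value and the pipeline body below are placeholders, not the load-test harness code):

    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    TIMEOUT_MS = 4 * 60 * 60 * 1000  # placeholder; the harness passes self.timeout_ms

    pipeline = beam.Pipeline()  # DataflowRunner options would be passed here
    _ = pipeline | beam.Create([1, 2, 3])  # stand-in for the real load-test graph
    result = pipeline.run()
    try:
        # duration is in milliseconds; returns the state reached within that time.
        result.wait_until_finish(duration=TIMEOUT_MS)
    except AssertionError:
        # Matches the failure above: the Dataflow runner asserts when the job
        # is still non-terminal (e.g. JOB_STATE_CANCELLING) after the wait.
        pass
    if not PipelineState.is_terminal(result.state):
        result.cancel()  # request cancellation rather than waiting indefinitely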

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 24s
15 actionable tasks: 15 executed

Publishing build scan...
https://ge.apache.org/s/4gsjt3vdfgjce

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #992

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/992/display/redirect>

Changes:


------------------------------------------
[...truncated 33.34 KB...]
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.48.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.48.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.48.0.dev0)
  Using cached cryptography-40.0.2-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.48.0.dev0)
  Using cached hypothesis-6.75.2-py3-none-any.whl (413 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.133 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.133-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable!=2.18.0,<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.2-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3067926 sha256=a890d1ef8c4157cb89473fcc76335e83b82bc0e9197e524e30b030eeca22c029
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, protobuf, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, proto-plus, pandas, httplib2, googleapis-common-protos, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpcio-status, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
  Attempting uninstall: protobuf
    Found existing installation: protobuf 4.23.0
    Uninstalling protobuf-4.23.0:
      Successfully uninstalled protobuf-4.23.0
Successfully installed PyJWT-2.7.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.133 botocore-1.29.133 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.2 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.17.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 protobuf-4.22.5 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
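
For context, the pre-building workflow that this message points to is enabled through pipeline options; a rough sketch, assuming the documented Beam Python flags (the project, bucket, and registry values below are placeholders):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                # placeholder
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',  # placeholder
        '--streaming',
        # Pre-build an SDK container with the extra dependencies baked in,
        # so workers skip the per-startup pip installs:
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/my-project/prebuilt',  # placeholder
    ])
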
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0512125346.1683904090.461481/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0512125346.1683904090.461481/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0512125346.1683904090.461481/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0512125346.1683904090.461481/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230512150810462533-8075'
 createTime: '2023-05-12T15:08:11.623521Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-12_08_08_11-6711512923687460185'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0512125346'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-12T15:08:11.623521Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-12_08_08_11-6711512923687460185]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-12_08_08_11-6711512923687460185
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-12_08_08_11-6711512923687460185?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-12_08_08_11-6711512923687460185 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:16.610Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:17.875Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:17.901Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:17.988Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.061Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.088Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.156Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.218Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.266Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.307Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.330Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.365Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.396Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.430Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.455Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.483Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.517Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.549Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.580Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.610Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.646Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
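
The "Combine with Top 0" stages above (KeyWithVoid -> CombinePerKey -> UnKey) are the standard expansion of a global combine; a rough, runnable sketch of the shape implied by those fused steps (the source and step names are illustrative, not the harness code):

    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(100))  # stand-in for the synthetic source
         # CombineGlobally expands to KeyWithVoid -> CombinePerKey -> UnKey,
         # matching the fused stage names in the log above:
         | 'Combine with Top 0' >> beam.CombineGlobally(combiners.TopCombineFn(10))
         | 'Consume 0' >> beam.Map(print))
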
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.747Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.784Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.819Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.844Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:18.884Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:19.058Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:19.085Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:19.150Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-12_08_08_11-6711512923687460185 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:53.974Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
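
The metric-descriptor warning above can be addressed by pruning unused custom descriptors, as the message suggests; a sketch assuming the google-cloud-monitoring client library (the project name is a placeholder, and the filter keeps only custom.googleapis.com/* metrics):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = monitoring_v3.ListMetricDescriptorsRequest(
        name='projects/my-project',  # placeholder project
        filter='metric.type = starts_with("custom.googleapis.com/")',
    )
    for descriptor in client.list_metric_descriptors(request=request):
        # Deleting a descriptor is irreversible; review before enabling.
        print('would delete', descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)
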
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:08:58.593Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:09:30.792Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:09:43.811Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T15:24:47.170Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T16:05:14.131Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T16:06:15.653Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T16:07:17.411Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T16:39:18.846Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T16:43:21.026Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T17:07:22.252Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T17:13:22.852Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T17:21:23.825Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T17:41:24.823Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T17:47:25.886Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T17:58:27.031Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T18:15:29.023Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T18:23:29.865Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T18:34:30.679Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T18:50:32.546Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T18:58:33.537Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T19:08:34.257Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T19:24:36.346Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T19:35:37.871Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T19:43:38.745Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T19:58:41.722Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T20:00:54.896Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-12_08_08_11-6711512923687460185.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T20:00:54.921Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T20:00:54.969Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T20:00:54.991Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T20:00:55.019Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-12T20:00:55.043Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-12_08_08_11-6711512923687460185 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-12_08_08_11-6711512923687460185?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 26s
15 actionable tasks: 15 executed

Publishing build scan...
https://ge.apache.org/s/5e5ip4gudvr2k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #991

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/991/display/redirect>

Changes:


------------------------------------------
[...truncated 34.76 KB...]
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3067918 sha256=8607334b93d79b3311194f44dad5276cd2c4a130c4a1ee071526ae7306d37588
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, protobuf, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, proto-plus, pandas, httplib2, googleapis-common-protos, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpcio-status, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
  Attempting uninstall: protobuf
    Found existing installation: protobuf 4.23.0
    Uninstalling protobuf-4.23.0:
      Successfully uninstalled protobuf-4.23.0
Successfully installed PyJWT-2.7.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.132 botocore-1.29.132 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.18.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.18.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 protobuf-4.22.5 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0510215142.1683817739.410498/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0510215142.1683817739.410498/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0510215142.1683817739.410498/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0510215142.1683817739.410498/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230511150859411518-1418'
 createTime: '2023-05-11T15:09:01.098981Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-11_08_09_00-6023389183075012921'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0510215142'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-11T15:09:01.098981Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-11_08_09_00-6023389183075012921]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-11_08_09_00-6023389183075012921
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-11_08_09_00-6023389183075012921?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-11_08_09_00-6023389183075012921 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:05.707Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:06.930Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:06.948Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.015Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.106Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.132Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.192Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.246Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.275Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.302Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.324Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.346Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.375Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.408Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.430Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.452Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.484Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.516Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.539Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.572Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.595Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.685Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.714Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.746Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.767Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.797Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.965Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:07.985Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:08.012Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-11_08_09_00-6023389183075012921 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:42.888Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:09:47.491Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:10:21.022Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:10:33.320Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:18:35.809Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:21:11.629Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:21:11.659Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:21:21.385Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:58:34.027Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T15:59:35.226Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T16:01:36.806Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T16:31:38.182Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T17:03:39.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T17:06:40.328Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T17:14:41.333Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T17:38:42.306Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T17:40:43.106Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T17:52:58.183Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T18:12:45.658Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T18:14:46.343Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T18:28:51.207Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T18:46:48.354Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T18:51:49.076Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T19:03:50.146Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T19:19:51.014Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T19:27:52.459Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T19:38:53.238Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T19:54:54.236Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
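The repeated autoscaling messages above show streaming autoscaling holding the pool at its target of 5. A hedged sketch of the pipeline options that govern this behaviour follows; the flag values are examples consistent with the log, not this job's recorded configuration:

    # Example Dataflow autoscaling options for a streaming job.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--streaming',
        '--autoscaling_algorithm=THROUGHPUT_BASED',
        '--num_workers=5',      # matches "Starting 5 workers in us-central1-f..."
        '--max_num_workers=5',  # caps upscaling at the level seen in the log
    ])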
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-11_08_09_00-6023389183075012921 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T20:00:53.048Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-11_08_09_00-6023389183075012921.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T20:00:53.075Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T20:00:53.125Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T20:00:53.145Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T20:00:53.168Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-11T20:00:53.190Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-11_08_09_00-6023389183075012921?project=<ProjectId>
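The assertion comes from load_test.py waiting on wait_until_finish with a timeout. A condensed sketch of that guard, with paraphrased names (the real code lives in apache_beam/testing/load_tests/load_test.py; the explicit cancel() here is an assumption, not necessarily what the harness does):

    # Fail fast when a Dataflow job never reaches a terminal state.
    from apache_beam.runners.runner import PipelineState

    def wait_or_fail(result, timeout_ms, console_url):
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            result.cancel()  # assumption: ask Dataflow to stop the stuck job
            raise AssertionError(
                'Job did not reach a terminal state after waiting. '
                'Console URL: {}'.format(console_url))
        return state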

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 54m 53s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:66)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:61)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://ge.apache.org/s/dyutifjdbkvje

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #990

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/990/display/redirect>

Changes:


------------------------------------------
[...truncated 34.20 KB...]
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.131 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.131-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-2.0.2-py3-none-any.whl (123 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3067088 sha256=83a112b9b187d3aaa82b87bd0c86b1cbccf7abbdf248f48b3f66badc55cf6628
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, protobuf, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, proto-plus, pandas, httplib2, googleapis-common-protos, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpcio-status, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
  Attempting uninstall: protobuf
    Found existing installation: protobuf 4.23.0
    Uninstalling protobuf-4.23.0:
      Successfully uninstalled protobuf-4.23.0
Successfully installed PyJWT-2.7.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.131 botocore-1.29.131 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 protobuf-4.22.5 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
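The pre-building workflow referenced above can be enabled through pipeline options; a sketch with flag names taken from the linked Dataflow guide, where the registry push URL is a placeholder, not a value from this job:

    # Pre-build the SDK worker container once instead of installing
    # dependencies on every worker start-up.
    prebuild_args = [
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/apache-beam-testing/prebuilt',  # placeholder
    ]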
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683731266.287218/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683731266.287218/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683731266.287218/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683731266.287218/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230510150746288192-5108'
 createTime: '2023-05-10T15:07:47.435629Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-10_08_07_46-17990350135456302487'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0507185346'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-10T15:07:47.435629Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-10_08_07_46-17990350135456302487]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-10_08_07_46-17990350135456302487
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-10_08_07_46-17990350135456302487?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-10_08_07_46-17990350135456302487 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:52.456Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:53.769Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:53.804Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:53.890Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:53.952Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:53.988Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.055Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.122Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.162Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.190Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.222Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.243Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.277Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.310Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.342Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.365Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.399Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.432Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.466Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.497Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.530Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.636Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
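The fusion messages above describe the Combine load test's shape: a synthetic read feeding a global Top combine, run on Streaming Engine. A minimal reconstruction of that shape, substituting a small in-memory source for the synthetic one (the real pipeline is combine_test.py; step labels echo the log):

    # Sketch of the fused pipeline: Read -> Combine with Top -> Consume.
    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read synthetic' >> beam.Create(range(100))
            | 'Combine with Top 0' >> beam.CombineGlobally(
                combiners.TopCombineFn(n=10)).without_defaults()
            | 'Consume 0' >> beam.Map(len))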
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.675Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.707Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.741Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.764Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.930Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.959Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:07:54.989Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-10_08_07_46-17990350135456302487 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:08:02.760Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:08:40.265Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:09:14.753Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:09:27.384Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T15:20:13.590Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T16:01:53.144Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T16:03:54.607Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T16:04:55.734Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T16:33:57.161Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T16:35:08.094Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T17:07:59.943Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T17:09:06.121Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T17:26:07.901Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T17:44:05.937Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T17:45:10.820Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T18:19:08.004Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T18:56:13.423Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T19:32:10.562Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-10_08_07_46-17990350135456302487 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T20:01:13.507Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-10_08_07_46-17990350135456302487.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T20:01:13.550Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T20:01:13.624Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T20:01:13.690Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T20:01:13.716Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-10T20:01:13.735Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-10_08_07_46-17990350135456302487?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 19s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:66)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:61)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://ge.apache.org/s/majhd4lapr5ji

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #989

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/989/display/redirect>

Changes:


------------------------------------------
[...truncated 34.95 KB...]
  Downloading botocore-1.29.130-py3-none-any.whl (10.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 10.7/10.7 MB 80.7 MB/s eta 0:00:00
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-2.0.2-py3-none-any.whl (123 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3068516 sha256=36b9cf49eb8b09ae7e6b3b20554a4fb1d7ec9853b2b6b70744fccb842929b13e
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, protobuf, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, proto-plus, pandas, httplib2, googleapis-common-protos, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpcio-status, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
  Attempting uninstall: protobuf
    Found existing installation: protobuf 4.23.0
    Uninstalling protobuf-4.23.0:
      Successfully uninstalled protobuf-4.23.0
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.130 botocore-1.29.130 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 protobuf-4.22.4 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683644949.147364/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683644949.147364/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683644949.147364/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683644949.147364/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230509150909148275-7901'
 createTime: '2023-05-09T15:09:10.442826Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-09_08_09_09-2319503586605036788'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0507185346'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-09T15:09:10.442826Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-09_08_09_09-2319503586605036788]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-09_08_09_09-2319503586605036788
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-09_08_09_09-2319503586605036788?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-09_08_09_09-2319503586605036788 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:20.215Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:26.648Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:31.671Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:32.988Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.051Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.070Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.121Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.176Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.209Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.231Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.254Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.286Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.318Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.349Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.376Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.405Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.432Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.458Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.486Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.507Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.533Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.631Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.664Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.700Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.734Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.767Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.923Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.959Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.991Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-09_08_09_09-2319503586605036788 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:36.181Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
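That metric-descriptor warning is benign but recurring; the Monitoring API endpoints it links to can prune stale descriptors. A hedged sketch of that cleanup, assuming the google-cloud-monitoring v3 Python client (the project id is the one that appears in the job metadata later in this log):

    from google.cloud import monitoring_v3

    PROJECT = 'apache-beam-testing'  # project id taken from this log
    client = monitoring_v3.MetricServiceClient()

    # List only the Dataflow-created custom metric descriptors mentioned
    # in the warning, i.e. those under custom.googleapis.com/*.
    request = {
        'name': 'projects/%s' % PROJECT,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Deletion is irreversible; in practice, check age/usage first.
        client.delete_metric_descriptor(request={'name': descriptor.name})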
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:10:13.538Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:10:45.353Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:10:57.974Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:30:43.484Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:10:26.895Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:13:31.873Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:15:33.103Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:38:30.637Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:44:35.462Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:47:32.758Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:52:35.191Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T17:00:43.485Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T17:41:43.173Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T17:45:44.881Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T18:03:34.286Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T18:03:41.564Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T18:15:57.662Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T18:23:58.862Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T18:50:00.112Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T19:01:03.262Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T19:40:54.589Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T19:56:14.654Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-09_08_09_09-2319503586605036788 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.103Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-09_08_09_09-2319503586605036788.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.128Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.167Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.188Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.217Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.232Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-09_08_09_09-2319503586605036788?project=<ProjectId>
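The assertion above is the load-test harness giving up: wait_until_finish(duration=...) takes a timeout in milliseconds and returns whatever state the job is in once it elapses, and a streaming job that is still being cancelled is not in a terminal state. A minimal sketch of the same guard (simplified; the real logic lives in load_test.py and dataflow_runner.py, and the timeout value here is a placeholder):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    TIMEOUT_MS = 4 * 60 * 60 * 1000  # placeholder 4-hour budget

    pipeline = beam.Pipeline(options=PipelineOptions())
    _ = pipeline | beam.Create([1, 2, 3]) | beam.CombineGlobally(sum)

    result = pipeline.run()
    # Returns the current job state once the job finishes or the timeout elapses.
    state = result.wait_until_finish(duration=TIMEOUT_MS)
    if state not in (PipelineState.DONE, PipelineState.CANCELLED):
        result.cancel()  # request cancellation, as seen in the log above
        raise AssertionError('Job did not reach a terminal state: %s' % state)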

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:66)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:61)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://ge.apache.org/s/ugnby6zk2xb34

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #988

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/988/display/redirect>

Changes:


------------------------------------------
[...truncated 33.89 KB...]
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.129 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.129-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-2.0.2-py3-none-any.whl (123 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3068524 sha256=77e26468709fd8bdb4b0af56dba0e839293d7074166c08df5729af726a5dd797
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.129 botocore-1.29.129 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
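The hint above refers to Beam's SDK container pre-building workflow: rather than every worker pip-installing the extra dependencies at startup, the SDK image is built once with them baked in. A sketch of the opt-in, assuming the --prebuild_sdk_container_engine and --docker_registry_push_url setup options available in recent SDK versions (the registry path is hypothetical):

    from apache_beam import Pipeline
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',  # project used by these load tests
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        # Build the SDK container once via Cloud Build and reuse it on all workers.
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/my-project/prebuilt-beam',  # hypothetical
    ])
    pipeline = Pipeline(options=options)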
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683558463.816989/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683558463.816989/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683558463.816989/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683558463.816989/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230508150743817993-9713'
 createTime: '2023-05-08T15:07:45.183691Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-08_08_07_44-10023564486681231376'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0507185346'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-08T15:07:45.183691Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-08_08_07_44-10023564486681231376]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-08_08_07_44-10023564486681231376
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-08_08_07_44-10023564486681231376?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-08_08_07_44-10023564486681231376 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:49.309Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:50.746Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:50.779Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:50.867Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:50.934Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:50.963Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.028Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.103Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.147Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.183Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.212Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.233Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.254Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.287Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.310Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.332Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.354Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.385Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.406Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.427Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.451Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.529Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.569Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.605Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.627Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.659Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.843Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.879Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:07:51.932Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-08_08_07_44-10023564486681231376 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:08:23.502Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:08:30.939Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:08:31.927Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:11:21.683Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:11:33.764Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:50:44.508Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T15:53:45.574Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T16:08:31.927Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T16:47:47.396Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T16:50:01.207Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T17:25:52.389Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T17:51:50.119Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T18:04:52.043Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T18:27:53.109Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T18:41:55.238Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T19:01:56.928Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T19:18:03.860Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T19:37:41.641Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T19:55:14.048Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-08_08_07_44-10023564486681231376 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T20:01:50.246Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-08_08_07_44-10023564486681231376.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T20:01:50.312Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T20:01:50.376Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T20:01:50.396Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T20:01:50.417Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-08T20:01:50.439Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-08_08_07_44-10023564486681231376?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 56m 37s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:66)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:61)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://ge.apache.org/s/3kemayiugzqoe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #987

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/987/display/redirect>

Changes:


------------------------------------------
[...truncated 33.90 KB...]
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.129 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.129-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-2.0.2-py3-none-any.whl (123 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.0-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3067043 sha256=aae4a98a9601e14b2e642cebf0974b5cebd27fd6d3d89f288266f62ee9558e21
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.129 botocore-1.29.129 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.0 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.11 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507150205.1683472073.113969/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507150205.1683472073.113969/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507150205.1683472073.113969/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507150205.1683472073.113969/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230507150753115069-5944'
 createTime: '2023-05-07T15:07:54.297030Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-07_08_07_53-13746431071402200075'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0507150205'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-07T15:07:54.297030Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-07_08_07_53-13746431071402200075]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-07_08_07_53-13746431071402200075
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-07_08_07_53-13746431071402200075?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-07_08_07_53-13746431071402200075 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:07:59.157Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:00.538Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:00.575Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:00.652Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:00.722Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:00.757Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:00.822Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:00.888Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:00.934Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:00.970Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:00.998Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.018Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.051Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.082Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.114Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.146Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.179Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.204Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.231Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.265Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.287Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
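
The fusion messages above trace out the shape of the load-test graph: a synthetic source, a timing hook, a global Top combine (expanded into KeyWithVoid/CombinePerKey/UnKey), and a consumer. A minimal sketch of that shape, with beam.Create standing in for the synthetic source:

    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read synthetic' >> beam.Create(range(1000))  # stand-in source
            | 'Measure time: Start' >> beam.Map(lambda x: x)  # timing hook
            | 'Combine with Top 0' >> beam.CombineGlobally(
                combiners.TopCombineFn(10))
            | 'Consume 0' >> beam.Map(len)
            | 'Measure time: End 0' >> beam.Map(lambda x: x))
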
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.392Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.434Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.462Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.494Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.526Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.711Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.740Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:01.789Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-07_08_07_53-13746431071402200075 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:28.494Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
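
The warning above points to the Cloud Monitoring API for listing and deleting metric descriptors. A hedged sketch of that cleanup with the google-cloud-monitoring client (not part of this build; the filter assumes only custom descriptors should be removed):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = monitoring_v3.ListMetricDescriptorsRequest(
        name='projects/apache-beam-testing',
        filter='metric.type = starts_with("custom.googleapis.com/")')
    for descriptor in client.list_metric_descriptors(request=request):
        # Deleting old/unused descriptors frees quota for new user metrics.
        client.delete_metric_descriptor(name=descriptor.name)
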
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:08:40.647Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:09:12.407Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:09:24.563Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T15:24:06.517Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T16:03:06.264Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T16:04:07.143Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T16:06:08.198Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T16:41:10.021Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T16:44:11.005Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T17:15:13.188Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T17:23:14.198Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T17:49:15.302Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T18:01:17.179Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T18:24:18.088Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T18:39:20.745Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T18:58:26.242Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T19:17:34.011Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T19:36:26.155Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T19:55:28.171Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
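
The worker count stays pinned at 5 throughout; that ceiling comes from the job's worker options. For reference, a sketch of the options that control it (values illustrative):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--num_workers=5',
        '--max_num_workers=5',  # autoscaling cannot go above this
        '--autoscaling_algorithm=THROUGHPUT_BASED',  # or NONE to disable
    ])
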
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-07_08_07_53-13746431071402200075 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T20:00:59.636Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-07_08_07_53-13746431071402200075.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T20:00:59.667Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T20:00:59.705Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T20:00:59.718Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T20:00:59.739Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-07T20:00:59.760Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-07_08_07_53-13746431071402200075?project=<ProjectId>
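
The assertion is raised by load_test.py after wait_until_finish(duration=self.timeout_ms) returns without the streaming job reaching a terminal state. A hedged sketch of that pattern, with an explicit cancel added so a stuck job is shut down rather than left running (function and variable names are illustrative):

    from apache_beam.runners.runner import PipelineState

    def run_with_timeout(pipeline, timeout_ms):
        result = pipeline.run()
        # For streaming jobs, wait_until_finish(duration=...) returns once
        # the timeout elapses, even if the job is still running.
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            result.cancel()  # stop the job instead of waiting indefinitely
        return state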

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 9s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:66)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:61)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://ge.apache.org/s/x5awgkhrg23qe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #986

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/986/display/redirect>

Changes:


------------------------------------------
[...truncated 33.78 KB...]
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Collecting boto3<2,>=1.9 (from apache-beam==2.48.0.dev0)
  Using cached boto3-1.26.129-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.129 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.129-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-2.0.2-py3-none-any.whl (123 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.1.0-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3067043 sha256=c627dbda3f4b5b30a8e6010da1409be3717d34471c0d56b9edfce0208b47bee7
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.129 botocore-1.29.129 cachetools-5.3.0 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.0 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.11 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0506150206.1683385674.993630/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0506150206.1683385674.993630/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0506150206.1683385674.993630/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0506150206.1683385674.993630/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230506150754994652-5806'
 createTime: '2023-05-06T15:07:56.073172Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-06_08_07_55-16609557349435942859'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0506150206'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-06T15:07:56.073172Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-06_08_07_55-16609557349435942859]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-06_08_07_55-16609557349435942859
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-06_08_07_55-16609557349435942859?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-06_08_07_55-16609557349435942859 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:01.460Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:02.851Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:02.911Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:02.996Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.055Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.107Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.197Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.246Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.283Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.321Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.340Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.362Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.399Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.428Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.460Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.496Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.523Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.552Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.593Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.619Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.646Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.728Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.783Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.802Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.836Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:03.876Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:04.064Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:04.094Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:04.143Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-06_08_07_55-16609557349435942859 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:19.204Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:08:43.276Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:09:14.677Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:09:27.514Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T15:29:40.579Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T16:08:57.884Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T16:11:57.466Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T16:42:01.879Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T16:51:04.466Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T17:16:01.565Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T17:28:06.680Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T17:52:04.428Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T18:04:05.810Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T18:26:07.543Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T18:42:08.646Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T19:56:11.108Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T19:59:10.810Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T20:00:37.500Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-06_08_07_55-16609557349435942859.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T20:00:37.530Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T20:00:37.572Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T20:00:37.590Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T20:00:37.612Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-06T20:00:37.632Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-06_08_07_55-16609557349435942859 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-06_08_07_55-16609557349435942859?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 10s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:66)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:61)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://ge.apache.org/s/27vtemxzhwrb6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #985

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/985/display/redirect>

Changes:


------------------------------------------
[...truncated 34.28 KB...]
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.127 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.127-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-2.0.2-py3-none-any.whl (123 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3067043 sha256=caa7f515b5778791522131d185d359ba84c754546074e6d5d99f3bfd36b23235
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.127 botocore-1.29.127 cachetools-5.3.0 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.11 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0505150210.1683299274.430128/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0505150210.1683299274.430128/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0505150210.1683299274.430128/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0505150210.1683299274.430128/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230505150754431163-4074'
 createTime: '2023-05-05T15:07:55.632521Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-05_08_07_55-17728492910770275487'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0505150210'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-05T15:07:55.632521Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-05_08_07_55-17728492910770275487]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-05_08_07_55-17728492910770275487
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-05_08_07_55-17728492910770275487?project=apache-beam-testing
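
Reconstructed from the messages above and below, the job was submitted with options roughly equivalent to the following (a hedged sketch, not the literal invocation; worker count and machine type are read off the worker-configuration log lines):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--streaming',                   # type: JOB_TYPE_STREAMING in the Job proto above
        '--num_workers=5',               # "Starting 5 workers in us-central1-c..."
        '--machine_type=e2-standard-2',  # "Worker configuration: e2-standard-2"
        '--job_name=load-tests-python-dataflow-streaming-combine-1-0505150210',
    ])
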
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-05_08_07_55-17728492910770275487 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:01.309Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:02.752Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:02.790Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:02.842Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:02.906Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:02.931Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:02.999Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.047Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.098Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.123Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.157Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.191Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.220Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.254Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.282Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.309Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.341Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.379Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.412Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.444Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.477Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
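
The fusion messages above imply a pipeline of roughly this shape (a sketch: step labels mirror the log, while the source and the timing DoFn are stand-ins, not the load test's actual implementations):

    import apache_beam as beam
    from apache_beam.transforms import combiners

    class MeasureTime(beam.DoFn):
        # Placeholder for the load test's timing DoFn; passes elements through.
        def process(self, element):
            yield element

    with beam.Pipeline() as p:
        _ = (p
             | 'Read synthetic' >> beam.Create(range(1000))  # stand-in for the synthetic source
             | 'Measure time: Start' >> beam.ParDo(MeasureTime())
             # CombineGlobally expands to the KeyWithVoid/CombinePerKey/UnKey
             # steps seen in the fusion messages above.
             | 'Combine with Top 0' >> beam.CombineGlobally(combiners.TopCombineFn(20))
             | 'Consume 0' >> beam.Map(len)
             | 'Measure time: End 0' >> beam.ParDo(MeasureTime()))
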
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.581Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.611Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.640Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.680Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.713Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.899Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.930Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:03.960Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-05_08_07_55-17728492910770275487 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:20.313Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:08:47.770Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:09:16.752Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:09:29.209Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T15:22:45.479Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T16:04:02.090Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T16:05:04.403Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T16:31:06.166Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T16:37:06.286Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T16:39:07.158Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T17:45:18.380Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T17:49:13.611Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T18:16:15.433Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T18:20:12.910Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T18:26:14.923Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T18:49:16.394Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T18:54:17.224Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T19:04:28.395Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T19:23:23.515Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T19:31:20.578Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T19:41:31.781Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T19:57:23.720Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T20:01:22.801Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-05_08_07_55-17728492910770275487.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T20:01:22.831Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T20:01:22.909Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T20:01:22.936Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T20:01:22.963Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-05T20:01:22.991Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-05_08_07_55-17728492910770275487 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-05_08_07_55-17728492910770275487?project=<ProjectId>
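
The traceback reduces to the following pattern (a minimal sketch of the check in load_test.py; names are assumed from the frames above): the test waits up to its timeout, then asserts the job reached a terminal state, so a job still in JOB_STATE_CANCELLING fails the assertion.

    from apache_beam.runners.runner import PipelineState

    def wait_for_terminal_state(result, timeout_ms):
        # wait_until_finish() returns the job's current state once the
        # timeout elapses, whether or not that state is terminal.
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            raise AssertionError('Job did not reach a terminal state: %s' % state)
        return state
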

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 55s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:66)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:61)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://ge.apache.org/s/boa3ftqjon3cm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #984

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/984/display/redirect>

Changes:


------------------------------------------
[...truncated 33.76 KB...]
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.126 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.126-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-2.0.2-py3-none-any.whl (123 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3066340 sha256=882adc10642691f3071e8587d22d5007c0e7eb1c74ec6e16dec4f9a5cf37fb7d
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.126 botocore-1.29.126 cachetools-5.3.0 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.11 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0504150504.1683215491.097829/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0504150504.1683215491.097829/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0504150504.1683215491.097829/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0504150504.1683215491.097829/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230504155131098848-9652'
 createTime: '2023-05-04T15:51:32.275623Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-04_08_51_31-15560180432743871204'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0504150504'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-04T15:51:32.275623Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-04_08_51_31-15560180432743871204]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-04_08_51_31-15560180432743871204
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-04_08_51_31-15560180432743871204?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-04_08_51_31-15560180432743871204 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:36.482Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:37.628Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:37.664Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:37.742Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:37.815Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:37.837Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:37.909Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:37.954Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.003Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.031Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.063Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.094Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.130Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.161Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.194Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.221Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.252Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.276Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.295Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.329Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.362Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.459Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.509Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.532Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.568Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.604Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.796Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.825Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:51:38.889Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-04_08_51_31-15560180432743871204 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:52:07.434Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:52:07.494Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:52:10.697Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:52:16.982Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:52:47.692Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T15:52:59.848Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T16:10:53.272Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T16:51:02.102Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T16:52:12.924Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T16:54:04.152Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T17:25:05.637Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T17:26:06.967Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T17:59:08.601Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T18:00:09.918Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T18:16:11.137Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T18:35:22.771Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T18:38:14.647Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T19:10:16.066Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T19:47:20.654Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-04_08_51_31-15560180432743871204 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T20:01:05.754Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-04_08_51_31-15560180432743871204.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T20:01:05.800Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T20:01:05.873Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T20:01:05.899Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T20:01:05.923Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-04T20:01:05.941Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-04_08_51_31-15560180432743871204?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 12m 24s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/ptoxlaalkfzwu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #983

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/983/display/redirect>

Changes:


------------------------------------------
[...truncated 33.82 KB...]
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.125 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.125-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3065471 sha256=32d46453b20f63a7eb3614609902a69d311b3f66bf7b05bbe05b3383f2c1d725
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.125 botocore-1.29.125 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.75.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.11 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.4 requests-2.29.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
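
For readers unfamiliar with the hint above: on every run, pip reinstalls the pipeline's extra dependencies on each worker at startup, while the pre-building workflow bakes them into the worker image once. A minimal sketch of opting in via the standard Python SDK options (the registry URL is a placeholder, not taken from this job):

    # Hypothetical illustration of the SDK container pre-building workflow.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--prebuild_sdk_container_engine=cloud_build',  # or 'local_docker'
        '--docker_registry_push_url=gcr.io/<ProjectId>/prebuilt-sdk',  # placeholder
        '--requirements_file=requirements.txt',
    ])
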
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0503125358.1683127499.880805/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0503125358.1683127499.880805/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0503125358.1683127499.880805/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0503125358.1683127499.880805/pipeline.pb in 0 seconds.
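
The two uploads above are the artifacts DataflowRunner stages on every submission: the SDK tarball and the pipeline graph serialized as a proto. A minimal sketch of a submission that produces them, assuming illustrative option values and a trivial graph in place of the real synthetic source:

    # Minimal sketch of a streaming Dataflow submission; p.run() is the call
    # that stages dataflow_python_sdk.tar and pipeline.pb, then creates the job.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--streaming',
    ])
    p = beam.Pipeline(options=options)
    _ = p | 'Impulse' >> beam.Impulse()  # trivial graph; the real test reads a synthetic source
    result = p.run()
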
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230503152459882524-6396'
 createTime: '2023-05-03T15:25:01.298562Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-03_08_25_00-2088380597768479911'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0503125358'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-03T15:25:01.298562Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-03_08_25_00-2088380597768479911]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-03_08_25_00-2088380597768479911
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-03_08_25_00-2088380597768479911?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-03_08_25_00-2088380597768479911 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:06.256Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.139Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.174Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.226Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.311Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.346Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.411Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.457Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.491Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.523Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.556Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.587Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.622Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.656Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.678Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.709Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.745Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.781Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.805Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.835Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.868Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
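
The run of "Fusing consumer ... into ..." messages is the Dataflow optimizer collapsing adjacent steps into single execution stages; only shuffle boundaries such as GroupByKey (the WriteStream/ReadStream pair above) end a fused chain. A hypothetical illustration, not taken from this job:

    # Hypothetical illustration of fusion: the two Maps are adjacent ParDos, so
    # Dataflow executes them in one fused stage with no intermediate
    # materialization; the GroupByKey introduces a shuffle and ends the stage.
    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([('a', 1), ('b', 2)])
             | beam.Map(lambda kv: (kv[0], kv[1] + 1))   # fused with the next Map
             | beam.Map(lambda kv: (kv[0], kv[1] * 10))  # same fused stage
             | beam.GroupByKey())                        # shuffle: fusion barrier
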
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:08.975Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:09.015Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:09.059Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:09.087Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:09.113Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:09.272Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:09.301Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:09.325Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-03_08_25_00-2088380597768479911 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:44.289Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
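
The message above is about user-defined metrics: each distinct counter can create a Cloud Monitoring metric descriptor, and the project has hit the 100-descriptor limit, so new counters fall back to the shared dataflow.googleapis.com/job/user_counter metric. A minimal sketch of the kind of user counter it refers to, using the Beam metrics API (namespace and name are illustrative):

    # Minimal sketch of a user counter; 'load_test'/'elements_processed' are
    # illustrative names, not taken from this job.
    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class MeasureFn(beam.DoFn):
        def __init__(self):
            self.elements = Metrics.counter('load_test', 'elements_processed')

        def process(self, element):
            self.elements.inc()  # surfaces under dataflow.googleapis.com/job/user_counter
            yield element
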
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:53.477Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:25:53.514Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:26:03.112Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:26:26.360Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:26:38.756Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T15:42:32.841Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T16:22:34.348Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T16:23:33.451Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T16:25:34.589Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T17:00:36.231Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T17:04:37.898Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T17:33:39.292Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T17:43:54.616Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T18:07:42.212Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T18:22:43.974Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T18:43:45.774Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T18:58:50.095Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T19:19:48.223Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T19:34:59.600Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T19:54:51.517Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-03_08_25_00-2088380597768479911 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T20:01:12.665Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-03_08_25_00-2088380597768479911.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T20:01:13.896Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T20:01:13.964Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T20:01:13.990Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T20:01:14.027Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-03T20:01:14.055Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-03_08_25_00-2088380597768479911?project=<ProjectId>
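
What failed here is the harness's bounded wait: wait_until_finish(duration=...) did not observe a terminal state (the cancel request above is asynchronous, and streaming jobs take time to stop), so the Dataflow runner raised. A minimal sketch of the generic bounded-wait pattern, assuming `result` is the PipelineResult from p.run() and `timeout_ms` is defined; note that on Dataflow the non-terminal case surfaces as this AssertionError rather than as a return value:

    # Generic bounded-wait pattern; `result` and `timeout_ms` are assumed.
    from apache_beam.runners.runner import PipelineState

    state = result.wait_until_finish(duration=timeout_ms)
    if not PipelineState.is_terminal(state):
        result.cancel()  # cancellation is asynchronous; the job stops later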

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 38m 48s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/nkgbabjggga5o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #982

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/982/display/redirect>

Changes:


------------------------------------------
[...truncated 33.79 KB...]
Collecting boto3<2,>=1.9 (from apache-beam==2.48.0.dev0)
  Using cached boto3-1.26.124-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.124 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.124-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3064876 sha256=dfa4e85dc7f0da34a5a5c832cc761c891fbb4ac51eabf4076637fa78b561a74b
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.124 botocore-1.29.124 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.32.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.75.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.11 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.29.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0502150202.1683040077.126228/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0502150202.1683040077.126228/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0502150202.1683040077.126228/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0502150202.1683040077.126228/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230502150757127226-4932'
 createTime: '2023-05-02T15:07:58.283243Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-02_08_07_57-2454902379167885569'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0502150202'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-02T15:07:58.283243Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-02_08_07_57-2454902379167885569]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-02_08_07_57-2454902379167885569
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-02_08_07_57-2454902379167885569?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-02_08_07_57-2454902379167885569 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:05.770Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.283Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.321Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.388Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.458Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.492Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.535Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.595Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.638Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.668Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.701Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.734Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.769Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.794Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.818Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.852Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.874Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.898Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.921Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.953Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:07.984Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:08.109Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:08.146Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:08.171Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:08.187Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:08.212Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:08.376Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:08.413Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-02_08_07_57-2454902379167885569 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:08.467Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:23.311Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:08:46.281Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:09:17.896Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:09:30.166Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T15:27:29.056Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T16:08:59.869Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T16:10:00.862Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T16:12:01.754Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T16:41:03.957Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T16:42:04.907Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T17:22:06.673Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T17:35:17.594Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T17:56:19.101Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T18:10:09.984Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T18:31:11.474Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T18:45:12.397Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T19:07:14.549Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T19:22:26.286Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T19:42:17.246Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T19:59:28.343Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-02_08_07_57-2454902379167885569 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T20:00:46.894Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-02_08_07_57-2454902379167885569.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T20:00:46.925Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T20:00:46.967Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T20:00:46.987Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T20:00:47.014Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-02T20:00:47.029Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-02_08_07_57-2454902379167885569?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 18s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/2y74azhzaajqc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #981

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/981/display/redirect?page=changes>

Changes:

[noreply] [Python] Add saved_weights example to tf notebook (#26472)


------------------------------------------
[...truncated 34.47 KB...]
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.123 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.123-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3064756 sha256=4ebf8114930976fb489ddc2b9fec603f6986e269c3d2617c8605fa24d495c65e
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.123 botocore-1.29.123 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.32.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.75.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.11 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.29.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
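For reference, a minimal sketch of how the pre-building workflow that the message above points to can be enabled from pipeline options. The flag names follow the Beam Python SDK's SetupOptions; the project, bucket, and registry values are placeholders, not taken from this build:

from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder values throughout; the two prebuild flags are the point.
options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=my-project',                                   # placeholder
    '--region=us-central1',
    '--temp_location=gs://my-bucket/tmp',                     # placeholder
    '--prebuild_sdk_container_engine=cloud_build',            # build image once via Cloud Build
    '--docker_registry_push_url=gcr.io/my-project/prebuilt',  # placeholder registry
])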
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0501150203.1682953681.065197/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0501150203.1682953681.065197/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0501150203.1682953681.065197/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0501150203.1682953681.065197/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230501150801066363-5261'
 createTime: '2023-05-01T15:08:02.269322Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-05-01_08_08_01-16137296567276721078'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0501150203'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-05-01T15:08:02.269322Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-01_08_08_01-16137296567276721078]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-01_08_08_01-16137296567276721078
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-01_08_08_01-16137296567276721078?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-01_08_08_01-16137296567276721078 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:06.561Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:07.815Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:07.846Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:07.911Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:07.982Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.009Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.051Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.103Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.134Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.169Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.193Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.225Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.257Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.290Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.312Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.333Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.354Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.384Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.417Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.451Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.474Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.564Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.592Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.624Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.656Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.690Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.842Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.871Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:08.901Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-01_08_08_01-16137296567276721078 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:26.575Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:46.924Z: JOB_MESSAGE_WARNING: Autoscaling: Startup of the worker pool in zone us-central1-b reached 4 workers, but the goal was 5 workers. The service will retry. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#worker-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-05010808-lkq0-harness-45dn' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:46.957Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:08:46.987Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
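A possible mitigation sketch for the ZONE_RESOURCE_POOL_EXHAUSTED warning above, using the Beam Python SDK's WorkerOptions flags; the zone value below is a placeholder alternative, not a recommendation from this build:

from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=DataflowRunner',
    '--region=us-central1',
    # Leaving --worker_zone unset lets the service pick any zone in the
    # region; the value below pins a placeholder alternative instead.
    '--worker_zone=us-central1-f',
    '--num_workers=5',
])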
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:09:19.377Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:09:25.530Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:09:25.566Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:09:32.285Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:09:46.546Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:09:46.575Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:10:16.529Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T15:24:37.319Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T16:04:05.069Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T16:08:13.857Z: JOB_MESSAGE_WARNING: Autoscaling: Unable to reach resize target in zone us-central1-b. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-05010808-lkq0-harness-wgl3' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T16:08:33.337Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T16:39:15.151Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T16:40:06.189Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T17:11:07.387Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T17:16:08.608Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T17:20:10.487Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T17:51:12.030Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T18:00:14.445Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T18:23:15.367Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T18:38:16.537Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T18:57:27.809Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T19:16:19.053Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T19:34:20.213Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T19:51:21.090Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-01_08_08_01-16137296567276721078 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T20:08:59.985Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-01_08_08_01-16137296567276721078.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T20:09:00.020Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T20:09:00.060Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T20:09:00.079Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T20:09:00.105Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-01T20:09:00.121Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-01_08_08_01-16137296567276721078?project=<ProjectId>
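For context, the load-test harness's wait-and-cancel pattern looks roughly like the sketch below (a simplification under stated assumptions, not the exact code in load_test.py); `pipeline` is assumed to be an already-constructed Beam pipeline:

from apache_beam.runners.runner import PipelineState

result = pipeline.run()
# wait_until_finish takes a timeout in milliseconds and returns the job's
# state at that point, terminal or not.
state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)
if not PipelineState.is_terminal(state):
    # Mirrors the JOB_STATE_CANCELLING transition seen above: the job never
    # reached a terminal state within the timeout, so cancel and fail.
    result.cancel()
    raise AssertionError('Job did not reach a terminal state: %s' % state)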

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5h 2m 58s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/rtsn6can3u4m2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #980

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/980/display/redirect>

Changes:


------------------------------------------
[...truncated 33.59 KB...]
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.48.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.123 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.123-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3064756 sha256=5e29ccdc2df046e6f3764e1e75df549ea80390e5db24b8844abbb492223ff7bb
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.123 botocore-1.29.123 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.32.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.75.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.11 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.29.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0430150153.1682867274.386199/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0430150153.1682867274.386199/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0430150153.1682867274.386199/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0430150153.1682867274.386199/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230430150754387450-2265'
 createTime: '2023-04-30T15:07:55.519242Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-30_08_07_55-14391295089652186839'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0430150153'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-30T15:07:55.519242Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-30_08_07_55-14391295089652186839]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-30_08_07_55-14391295089652186839
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-30_08_07_55-14391295089652186839?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-30_08_07_55-14391295089652186839 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:04.177Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:10.525Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:12.874Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:12.923Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.008Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.035Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.087Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.138Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.173Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.198Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.221Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.254Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.286Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.313Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.331Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.364Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.392Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.410Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.440Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.465Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.494Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.589Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.641Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.667Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.684Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.710Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.880Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.904Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:13.944Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-30_08_07_55-14391295089652186839 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:18.956Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:55.422Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:08:55.451Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:09:25.162Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:09:28.285Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:09:40.849Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T15:37:48.893Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T16:18:08.238Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T16:20:09.087Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T16:51:10.786Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T16:58:12.650Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T17:24:24.833Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T17:35:17.257Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T17:58:28.764Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T18:12:20.180Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T19:03:26.303Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T19:23:24.650Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T19:41:25.824Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T20:00:26.895Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T20:00:41.664Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-30_08_07_55-14391295089652186839.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T20:00:41.694Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T20:00:41.750Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T20:00:41.775Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T20:00:41.804Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-30T20:00:41.828Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-30_08_07_55-14391295089652186839 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-30_08_07_55-14391295089652186839?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 17s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/ht2qav7lig4p6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #979

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/979/display/redirect>

Changes:


------------------------------------------
[...truncated 33.94 KB...]
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.123 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.123-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3064756 sha256=29c3bbad88ae591f130b4b222e241d8d5da0e1c3ab6c93e9b7bdf0c2588eba3f
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.123 botocore-1.29.123 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.32.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.74.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.11 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.29.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
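
[Editor's note: the pre-building workflow mentioned above is enabled through pipeline options in recent Beam Python SDKs. A hedged sketch, with example values for the registry URL:]

# Sketch only; option names as provided by recent Beam Python SDKs, values are examples.
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--prebuild_sdk_container_engine=cloud_build',  # build the worker image once via Cloud Build
    '--docker_registry_push_url=gcr.io/apache-beam-testing/prebuilt',  # example registry path
])
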
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0429150214.1682780877.151713/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0429150214.1682780877.151713/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0429150214.1682780877.151713/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0429150214.1682780877.151713/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230429150757152780-9338'
 createTime: '2023-04-29T15:07:58.352423Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-29_08_07_57-10986933795364995144'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0429150214'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-29T15:07:58.352423Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-29_08_07_57-10986933795364995144]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-29_08_07_57-10986933795364995144
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-29_08_07_57-10986933795364995144?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-29_08_07_57-10986933795364995144 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:04.329Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:05.661Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:05.702Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:05.776Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:05.852Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:05.885Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:05.940Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.004Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.052Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.083Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.115Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.146Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.180Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.212Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.244Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.279Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.303Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.341Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.372Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.403Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.433Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
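
[Editor's note: the fusion messages above trace the optimized shape of the Combine load test: a synthetic read feeding a timing step, a global Top combine (which Dataflow expands into KeyWithVoid -> CombinePerKey -> UnKey), then a consume/timing tail. As a rough illustration only, a minimal pipeline with the same shape might look like the sketch below; the labels mirror the log, but the source and timing steps are stand-ins, not the real load-test code.]

import apache_beam as beam
from apache_beam.transforms import combiners

# Minimal sketch of the pipeline shape implied by the fusion log above.
# 'Read synthetic' and the 'Measure time' steps are simplified stand-ins for
# the SyntheticSource and metric-collecting DoFns used by the real test.
with beam.Pipeline() as p:
    _ = (
        p
        | 'Read synthetic' >> beam.Create([(b'key', b'value')] * 1000)
        | 'Measure time: Start' >> beam.Map(lambda kv: kv)
        # CombineGlobally expands to KeyWithVoid -> CombinePerKey -> UnKey,
        # matching the fused stages in the log.
        | 'Combine with Top 0' >> beam.CombineGlobally(combiners.TopCombineFn(10))
        | 'Consume 0' >> beam.Map(len)
        | 'Measure time: End 0' >> beam.Map(lambda n: n))
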
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.541Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.581Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.611Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.642Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.675Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.848Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.882Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:06.933Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-29_08_07_57-10986933795364995144 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:19.295Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
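
[Editor's note: the message above is informational: the project has hit the 100-descriptor limit for custom.googleapis.com metrics, so per-job user counters remain available only under dataflow.googleapis.com/job/user_counter. As a hedged sketch, using the google-cloud-monitoring client (not part of this build), stale custom descriptors could be listed for cleanup like this:]

# Sketch only: list custom metric descriptors so unused ones can be deleted.
# Assumes the google-cloud-monitoring package and suitable credentials.
from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
request = monitoring_v3.ListMetricDescriptorsRequest(
    name='projects/apache-beam-testing',
    filter='metric.type = starts_with("custom.googleapis.com/")')
for descriptor in client.list_metric_descriptors(request=request):
    print(descriptor.type)
    # client.delete_metric_descriptor(name=descriptor.name)  # uncomment to delete
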
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:46.270Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:08:46.308Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:09:06.049Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:09:18.003Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:09:30.278Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:15:40.888Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:57:59.627Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T15:59:00.562Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T16:00:01.539Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T16:32:03.781Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T16:34:08.231Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T17:05:08.600Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T17:12:07.923Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T17:38:09.378Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T17:49:10.618Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T18:11:15.584Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T18:25:16.668Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T18:48:18.863Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T19:03:20.894Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T19:25:18.196Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T19:40:19.830Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T20:00:21.476Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T20:00:41.239Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-29_08_07_57-10986933795364995144.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T20:00:41.279Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T20:00:41.369Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T20:00:41.399Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T20:00:41.439Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-29T20:00:41.467Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-29_08_07_57-10986933795364995144 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-29_08_07_57-10986933795364995144?project=<ProjectId>
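
[Editor's note: for context on the traceback above: the load test submits the streaming job, waits with a timeout, and then requires a terminal state; a streaming job still RUNNING or CANCELLING when the wait returns trips this assertion. A minimal sketch of that pattern, assuming a generic PipelineResult rather than the actual test harness:]

# Sketch of the wait-then-assert pattern the load test uses; the real logic
# lives in apache_beam/testing/load_tests/load_test.py. TIMEOUT_MS is a
# placeholder for the test's configured timeout.
from apache_beam.runners.runner import PipelineState

TIMEOUT_MS = 4 * 60 * 60 * 1000  # assumed 4-hour budget

def wait_for_terminal_state(result):
    state = result.wait_until_finish(duration=TIMEOUT_MS)
    if not PipelineState.is_terminal(state):
        result.cancel()  # best-effort cleanup of the still-running job
        raise AssertionError('Job did not reach a terminal state: %s' % state)
    return state
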

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 12s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/ql3xmpzbavzcm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #978

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/978/display/redirect>

Changes:


------------------------------------------
[...truncated 34.58 KB...]
Collecting botocore<1.30.0,>=1.29.122 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.122-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3062230 sha256=a11674b6a3a05eb7957c1257af4fd8bb7da133179a42e355619fc75247e787a2
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.122 botocore-1.29.122 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.32.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.74.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.11 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.29.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0428150756.1682694744.707490/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0428150756.1682694744.707490/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0428150756.1682694744.707490/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0428150756.1682694744.707490/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230428151224709079-3636'
 createTime: '2023-04-28T15:12:26.104315Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-28_08_12_25-6620465239894155842'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0428150756'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-28T15:12:26.104315Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-28_08_12_25-6620465239894155842]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-28_08_12_25-6620465239894155842
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-28_08_12_25-6620465239894155842?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-28_08_12_25-6620465239894155842 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:36.731Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:37.985Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.044Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.120Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.191Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.227Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.294Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.348Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.388Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.404Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.433Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.457Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.489Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.513Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.541Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.567Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.588Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.620Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.645Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.676Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.709Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.813Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.851Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.881Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.915Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:38.946Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:39.120Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:39.152Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:39.195Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-28_08_12_25-6620465239894155842 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:12:56.370Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:13:20.121Z: JOB_MESSAGE_WARNING: Autoscaling: Startup of the worker pool in zone us-central1-c reached 3 workers, but the goal was 5 workers. The service will retry. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#worker-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04280812-jxez-harness-3slq' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:13:20.155Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:13:20.176Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:13:21.643Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:13:24.066Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:13:24.095Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:14:36.932Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:14:36.955Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
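
[Editor's note: the ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS warning above means Compute Engine ran out of capacity for the requested machine type in the chosen zone, so the pool crawled up from 1 worker instead of starting at 5. One mitigation, sketched below with assumed values, is to steer worker placement through standard Dataflow pipeline options:]

# Sketch only: worker-placement options for DataflowRunner. The zone value is
# an example; omitting --worker_zone lets the service pick a zone in --region.
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=apache-beam-testing',
    '--region=us-central1',
    '--worker_zone=us-central1-f',  # assumed alternative zone with capacity
    '--num_workers=5',
])
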
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:15:12.250Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:15:24.686Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:15:26.965Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:15:26.989Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:15:36.929Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:53:40.633Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:56:42.529Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T15:58:43.792Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T16:26:44.977Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T16:28:56.446Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T17:01:48.294Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T17:11:53.361Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T17:35:54.229Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T17:49:51.442Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T18:12:02.470Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T18:26:53.637Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T18:45:55.488Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T19:04:57.548Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T19:24:12.318Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T19:42:00.095Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-28_08_12_25-6620465239894155842 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T20:00:40.904Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-28_08_12_25-6620465239894155842.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T20:00:40.942Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T20:00:40.983Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T20:00:41.005Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T20:00:41.027Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-28T20:00:41.048Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-28_08_12_25-6620465239894155842?project=<ProjectId>
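
For context: the harness in load_test.py blocks on the submitted job and asserts that it reaches a terminal state within the configured timeout. A minimal sketch of that pattern (assuming `result` is the PipelineResult returned by pipeline.run() and `timeout_ms` is a duration in milliseconds; the cancel-and-raise handling is illustrative, not the harness's exact code):

    # Wait up to timeout_ms (milliseconds) for the Dataflow job to finish.
    from apache_beam.runners.runner import PipelineState

    state = result.wait_until_finish(duration=timeout_ms)
    if not PipelineState.is_terminal(state):
        result.cancel()  # best effort; mirrors the JOB_STATE_CANCELLING seen above
        raise AssertionError('Job did not reach a terminal state within the timeout')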

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 51m 18s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/rmfggawu66cbm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #977

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/977/display/redirect>

Changes:


------------------------------------------
[...truncated 33.12 KB...]
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.48.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.48.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.48.0.dev0)
  Using cached cryptography-40.0.2-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.48.0.dev0)
  Using cached hypothesis-6.74.0-py3-none-any.whl (409 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.121 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.121-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3062014 sha256=c4b6601284893a831d0f44262a8e940198c721cbf2658693b786af91b99eacb7
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.121 botocore-1.29.121 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.32.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.74.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.29.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in the SDK worker container; consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more at https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
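
As the log notes, the repeated per-run dependency installs can be avoided by pre-building the SDK container image. A hedged sketch of the relevant pipeline options (flag names per the Dataflow custom-containers guide linked above; the registry URL is a placeholder):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--prebuild_sdk_container_engine=cloud_build',          # or 'local_docker'
        '--docker_registry_push_url=gcr.io/<my-project>/beam',  # placeholder registry
        '--requirements_file=requirements.txt',                 # the extra dependencies
    ])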
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0427150219.1682608085.417329/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0427150219.1682608085.417329/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0427150219.1682608085.417329/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0427150219.1682608085.417329/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230427150805418366-9209'
 createTime: '2023-04-27T15:08:06.593165Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-27_08_08_05-2550698046437071704'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0427150219'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-27T15:08:06.593165Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-27_08_08_05-2550698046437071704]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-27_08_08_05-2550698046437071704
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-27_08_08_05-2550698046437071704?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-27_08_08_05-2550698046437071704 is in state JOB_STATE_PENDING
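
For reference, the submission above corresponds to launching the pipeline with options like the following (the values are taken from the Create job message; the assembly itself is an illustrative sketch, not the harness's code):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--job_name=load-tests-python-dataflow-streaming-combine-1-0427150219',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--streaming',  # matches type: JOB_TYPE_STREAMING
    ])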
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:11.056Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:15.597Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:15.629Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:15.696Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:15.764Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:15.794Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:15.842Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:15.890Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:15.928Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:15.956Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:15.992Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.034Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.069Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.090Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.122Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.154Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.190Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.222Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.250Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.276Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.304Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
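
The fused step names above imply a pipeline of roughly this shape (an illustrative sketch only: the synthetic source, the MeasureTime DoFn, and the Top size are assumed stand-ins for the load-test internals):

    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline(options=options) as p:
        (p
         | 'Read synthetic' >> beam.io.Read(synthetic_source)  # assumed SyntheticSource
         | 'Measure time: Start' >> beam.ParDo(MeasureTime())  # assumed metrics DoFn
         | 'Combine with Top 0' >> beam.CombineGlobally(combiners.TopCombineFn(1000))
         | 'Consume 0' >> beam.Map(lambda x: x)
         | 'Measure time: End 0' >> beam.ParDo(MeasureTime()))  # assumed metrics DoFn

CombineGlobally expands into the KeyWithVoid / CombinePerKey / UnKey steps visible in the fusion messages.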
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.386Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.411Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.430Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.460Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.484Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.643Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.672Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:16.716Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-27_08_08_05-2550698046437071704 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:08:48.617Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:09:10.299Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:09:10.330Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:09:29.916Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:09:29.943Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:09:39.924Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
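
The "goal was 5" in the quota messages reflects the job's worker configuration. Knobs like the following control it (these are real Dataflow pipeline options and the values mirror this job's goal of 5 workers, but whether the harness sets all three is an assumption):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--num_workers=5',                           # initial worker count
        '--max_num_workers=5',                       # autoscaling ceiling
        '--autoscaling_algorithm=THROUGHPUT_BASED',  # streaming autoscaling mode
    ])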
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:09:43.017Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:09:55.521Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T15:33:53.571Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T16:13:13.195Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T16:16:14.560Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T16:45:15.644Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T16:46:20.929Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T16:47:21.896Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T17:13:44.303Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T17:13:49.792Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T17:21:06.812Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T17:22:11.632Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T17:56:09.321Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T17:57:13.968Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T18:00:11.281Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T18:30:16.141Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T18:32:17.227Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T19:05:14.330Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-27T19:10:15.733Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Build timed out (after 720 minutes). Marking the build as aborted.
FATAL: command execution failed
hudson.remoting.ChannelClosedException: Channel "hudson.remoting.Channel@3e483a4d:apache-beam-jenkins-4": Remote call on apache-beam-jenkins-4 failed. The channel is closing down or has closed down
	at hudson.remoting.Channel.call(Channel.java:993)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
	at com.sun.proxy.$Proxy135.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1215)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1207)
	at hudson.Launcher$ProcStarter.join(Launcher.java:524)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:321)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
	at hudson.model.Build$BuildExecution.build(Build.java:199)
	at hudson.model.Build$BuildExecution.doRun(Build.java:164)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
	at hudson.model.Run.execute(Run.java:1896)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
	at hudson.model.ResourceController.execute(ResourceController.java:101)
	at hudson.model.Executor.run(Executor.java:442)
Caused by: java.io.IOException
	at hudson.remoting.Channel.close(Channel.java:1470)
	at hudson.remoting.Channel.close(Channel.java:1447)
	at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:902)
	at hudson.slaves.SlaveComputer.access$100(SlaveComputer.java:111)
	at hudson.slaves.SlaveComputer$2.run(SlaveComputer.java:782)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:68)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
FATAL: Channel "hudson.remoting.Channel@3e483a4d:apache-beam-jenkins-4": Remote call on apache-beam-jenkins-4 failed. The channel is closing down or has closed down
java.io.IOException
	at hudson.remoting.Channel.close(Channel.java:1470)
	at hudson.remoting.Channel.close(Channel.java:1447)
	at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:902)
	at hudson.slaves.SlaveComputer.access$100(SlaveComputer.java:111)
	at hudson.slaves.SlaveComputer$2.run(SlaveComputer.java:782)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:68)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused: hudson.remoting.ChannelClosedException: Channel "hudson.remoting.Channel@3e483a4d:apache-beam-jenkins-4": Remote call on apache-beam-jenkins-4 failed. The channel is closing down or has closed down
	at hudson.remoting.Channel.call(Channel.java:993)
	at hudson.Launcher$RemoteLauncher.kill(Launcher.java:1150)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:534)
	at hudson.model.Run.execute(Run.java:1896)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
	at hudson.model.ResourceController.execute(ResourceController.java:101)
	at hudson.model.Executor.run(Executor.java:442)



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #976

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/976/display/redirect>

Changes:


------------------------------------------
[...truncated 34.18 KB...]
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.120 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.120-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3062015 sha256=6d0d28025cb9b3476cb435427969b23facbbb9b5a319a8a97c3c8472e7855975
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.120 botocore-1.29.120 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.32.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.74.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in the SDK worker container; consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more at https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0426125404.1682521776.353158/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0426125404.1682521776.353158/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0426125404.1682521776.353158/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0426125404.1682521776.353158/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230426150936354144-2101'
 createTime: '2023-04-26T15:09:37.464579Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-26_08_09_36-11763189873239536061'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0426125404'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-26T15:09:37.464579Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-26_08_09_36-11763189873239536061]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-26_08_09_36-11763189873239536061
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-26_08_09_36-11763189873239536061?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-26_08_09_36-11763189873239536061 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:43.609Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:44.598Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:44.669Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:44.740Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:44.801Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:44.827Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:44.887Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:44.934Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:44.976Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.013Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.044Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.069Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.099Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.135Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.159Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.190Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.222Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.238Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.269Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.297Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.324Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.426Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.488Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.520Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.550Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.582Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.765Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.794Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:09:45.845Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-26_08_09_36-11763189873239536061 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:10:21.995Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:10:31.881Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:10:31.915Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized **** pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:10:41.625Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:10:41.653Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized **** pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:11:01.649Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:11:02.189Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:11:14.663Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T15:19:55.324Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T16:00:19.743Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T16:01:41.149Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T16:02:11.722Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T16:34:12.945Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T16:37:14.486Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T17:08:15.947Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T17:10:27.970Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T17:41:29.472Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T17:50:20.924Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T18:16:22.786Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T18:27:23.763Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T18:50:25.738Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T19:04:27.354Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T19:24:39.240Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T19:43:30.410Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T20:02:43.818Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-26_08_09_36-11763189873239536061 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T20:16:00.895Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-26_08_09_36-11763189873239536061.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T20:16:02.234Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T20:16:02.266Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T20:16:02.289Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T20:16:02.322Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-26T20:16:02.338Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-26_08_09_36-11763189873239536061?project=<ProjectId>
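[Editor's note: the traceback shows load_test.py raising as soon as wait_until_finish(duration=self.timeout_ms) returns a state that is not terminal, which is expected here since the streaming job was still being cancelled. A minimal sketch of that call path, with a cancel-on-timeout guard that is an assumption of this sketch, not what load_test.py does (it raises, as seen above):

    from apache_beam.runners.runner import PipelineState

    def wait_or_cancel(result, timeout_ms):
        # result is an apache_beam PipelineResult; duration is in milliseconds.
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            result.cancel()  # ask the runner to cancel the still-running job
        return state
]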

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5h 9m 15s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/sls4ui2zdwiuo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #975

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/975/display/redirect>

Changes:


------------------------------------------
[...truncated 34.33 KB...]
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.119 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.119-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3060185 sha256=1378d91f42c037c6b5b0e91bcc321455739f3b161626db932d0bb3b73a4c9c6e
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.119 botocore-1.29.119 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.72.4 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
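[Editor's note: the hint above refers to Dataflow's SDK container pre-building workflow, which bakes extra dependencies into the worker image once instead of installing them on every worker at startup. A hedged sketch of the relevant pipeline options; the flag names come from the Beam Python SDK, while the bucket, registry, and requirements file are placeholders:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=DataflowRunner",
        "--temp_location=gs://MY_BUCKET/tmp",
        "--requirements_file=requirements.txt",              # the extra deps to bake in
        "--prebuild_sdk_container_engine=cloud_build",       # or local_docker
        "--docker_registry_push_url=gcr.io/MY_PROJECT/prebuilt",
    ])
]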
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
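[Editor's note: as logged above, when --staging_location is unset the Dataflow client falls back to --temp_location. A minimal sketch, with a placeholder bucket:

    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(["--temp_location=gs://MY_BUCKET/tmp"])
    gcp = options.view_as(GoogleCloudOptions)
    print(gcp.staging_location or gcp.temp_location)  # apiclient substitutes temp_location
]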
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0425150548.1682435439.311175/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0425150548.1682435439.311175/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0425150548.1682435439.311175/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0425150548.1682435439.311175/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230425151039315746-8295'
 createTime: '2023-04-25T15:10:41.059541Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-25_08_10_40-1429216917815564524'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0425150548'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-25T15:10:41.059541Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-25_08_10_40-1429216917815564524]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-25_08_10_40-1429216917815564524
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-25_08_10_40-1429216917815564524?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-25_08_10_40-1429216917815564524 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:45.087Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.108Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.140Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.208Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.276Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.295Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.347Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.407Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.456Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.472Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.492Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.521Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.547Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.578Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.602Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.634Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.667Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.730Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.764Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.787Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.819Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.914Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.956Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:46.985Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:47.005Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:47.029Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:47.197Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:47.227Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:10:47.279Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-25_08_10_40-1429216917815564524 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:11:14.036Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:11:26.824Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:11:26.847Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:11:36.518Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:11:36.552Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:11:57.131Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:12:06.468Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:12:09.647Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:19:51.275Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T15:59:49.906Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T16:01:30.935Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T16:34:02.081Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T16:35:52.917Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T16:40:03.965Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T17:09:55.145Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T17:20:00.454Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T17:45:07.530Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T17:57:09.530Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T18:21:00.873Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T18:34:02.233Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T18:56:03.424Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T19:10:04.368Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T19:33:06.087Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T19:48:07.199Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-25_08_10_40-1429216917815564524 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T20:01:12.659Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-25_08_10_40-1429216917815564524.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T20:01:12.699Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T20:01:12.749Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T20:01:12.774Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T20:01:12.807Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-25T20:01:12.829Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-25_08_10_40-1429216917815564524?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 54m 55s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/g7s6po6zy5vjw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #974

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/974/display/redirect>

Changes:


------------------------------------------
[...truncated 33.55 KB...]
  Using cached azure_core-1.26.4-py3-none-any.whl (173 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.48.0.dev0)
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.118 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.118-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
WARNING: Retrying (Retry(total=9, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProtocolError('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))': /simple/threadpoolctl/
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3059756 sha256=78412f691a6450d52d94311d1f2124b8f1b3309f4a0759bba1d24ef68c360f8b
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.118 botocore-1.29.118 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.72.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0424150216.1682348874.062000/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0424150216.1682348874.062000/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0424150216.1682348874.062000/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0424150216.1682348874.062000/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230424150754063048-5682'
 createTime: '2023-04-24T15:07:55.277211Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-24_08_07_54-10006337955682986661'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0424150216'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-24T15:07:55.277211Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-24_08_07_54-10006337955682986661]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-24_08_07_54-10006337955682986661
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-24_08_07_54-10006337955682986661?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-24_08_07_54-10006337955682986661 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:07:58.979Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.025Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.052Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.110Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.179Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.198Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.244Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.306Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.338Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.364Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.385Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.411Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.431Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.463Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.492Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.511Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.541Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.567Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.593Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.617Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.640Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.726Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.747Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.775Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.798Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.831Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:01.992Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:02.018Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:02.051Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-24_08_07_54-10006337955682986661 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:24.147Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
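
[Editor's note: the descriptor limit called out above can be inspected programmatically. A minimal sketch, assuming the google-cloud-monitoring client library; PROJECT_ID is a placeholder, not this build's project, and deletion is deliberately left commented out:

from google.cloud import monitoring_v3

PROJECT_ID = "my-project"  # placeholder
client = monitoring_v3.MetricServiceClient()

# List custom metric descriptors; each one counts toward the
# 100-descriptor limit mentioned in the log message above.
descriptors = client.list_metric_descriptors(
    request={
        "name": f"projects/{PROJECT_ID}",
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    })
for descriptor in descriptors:
    print(descriptor.type)
    # Uncomment to free up descriptor quota for unused metrics:
    # client.delete_metric_descriptor(request={"name": descriptor.name})

As the log message notes, user metrics remain available under dataflow.googleapis.com/job/user_counter even when no new custom.googleapis.com/* descriptors can be created.]
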
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:08:47.331Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:09:17.807Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:09:30.318Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T15:20:52.194Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T16:08:00.306Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T16:09:01.656Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T16:11:12.553Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T16:42:03.702Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T16:45:04.817Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T17:16:06.093Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T17:18:17.446Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T17:35:08.457Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T17:51:09.283Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T18:00:10.192Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T18:29:11.752Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T19:06:13.190Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T19:43:24.687Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-24_08_07_54-10006337955682986661 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T20:01:03.541Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-24_08_07_54-10006337955682986661.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T20:01:03.578Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T20:01:03.622Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T20:01:03.638Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T20:01:03.656Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-24T20:01:03.681Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-24_08_07_54-10006337955682986661?project=<ProjectId>
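
[Editor's note: the load test calls result.wait_until_finish(duration=self.timeout_ms) and asserts a terminal state; here the job was still CANCELLING when the wait returned, which produced the AssertionError above. A minimal sketch of the wait-then-cancel pattern, not the framework's actual code; runner, project, and bucket values are placeholders:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.runners.runner import PipelineState

TIMEOUT_MS = 4 * 60 * 60 * 1000  # hypothetical 4 h budget, in milliseconds

options = PipelineOptions([
    "--runner=DataflowRunner",
    "--project=my-project",                # placeholder
    "--region=us-central1",
    "--temp_location=gs://my-bucket/tmp",  # placeholder
    "--streaming",
])
pipeline = beam.Pipeline(options=options)
_ = pipeline | beam.Create([1, 2, 3]) | beam.Map(lambda x: x)

result = pipeline.run()
# With a duration, recent Beam Python SDKs return the job's current
# state when the timeout elapses instead of blocking forever.
state = result.wait_until_finish(duration=TIMEOUT_MS)
if not PipelineState.is_terminal(state):
    result.cancel()                      # request cancellation explicitly
    state = result.wait_until_finish()   # block until CANCELLED (terminal)
print("final state:", state)

Cancelling and then re-waiting avoids tearing the driver down while the Dataflow job is still in a transitional state such as JOB_STATE_CANCELLING.]
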

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 17s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/2aza2k5gyt2iu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #973

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/973/display/redirect>

Changes:


------------------------------------------
[...truncated 1.16 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:45:00.209Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-b failed to bring up any of the desired 5 workers. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#worker-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:45:00.240Z: JOB_MESSAGE_ERROR: Workflow failed.
[... the same pair of JOB_MESSAGE_ERROR lines ("Startup of the worker pool in zone us-central1-b failed ... ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS ..." followed by "Workflow failed.") repeats roughly every 10 seconds from 2023-04-23T19:45:10Z through 2023-04-23T19:56:10Z; duplicates elided ...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:56:20.242Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:56:20.265Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:56:30.250Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:56:30.268Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:56:40.274Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:56:40.300Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:56:50.240Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:56:50.264Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:00.221Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:00.241Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:10.236Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:10.263Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:20.189Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:20.213Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:30.215Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:30.232Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:40.274Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:40.288Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:50.269Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:57:50.288Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:00.262Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:00.275Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:10.291Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:10.307Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:20.234Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:20.247Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:30.285Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:30.304Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:40.258Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:40.271Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:50.249Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:58:50.274Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:00.268Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:00.285Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:10.223Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:10.250Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:20.215Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:20.239Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:30.254Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:30.268Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:40.207Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:40.223Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:50.206Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T19:59:50.221Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T20:00:00.251Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T20:00:00.268Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T20:00:10.282Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T20:00:10.308Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T20:00:20.280Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T20:00:20.305Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T20:00:30.246Z: JOB_MESSAGE_ERROR: Startup of the **** pool in zone us-central1-b failed to bring up any of the desired 5 ****s. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#****-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04230807-h6hj-harness-83z5' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-23T20:00:30.275Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-23_08_07_56-8042074743962845450 is in state JOB_STATE_CANCELLING
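For anyone triaging the failure above: ZONE_RESOURCE_POOL_EXHAUSTED is GCE capacity in the requested zone, not a pipeline bug. As a hedged sketch (placeholders throughout, not this test's actual configuration), the standard apache_beam worker-placement options let a job fall back to other zones in a region:

    # Hedged sketch, not the load test's actual configuration: the standard
    # apache_beam WorkerOptions flags --worker_region / --worker_zone control
    # where Dataflow places workers. Project and bucket names are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                # placeholder
        '--region=us-central1',
        '--worker_region=us-central1',         # any zone in the region with capacity
        # '--worker_zone=us-central1-f',       # or pin one specific alternative zone
        '--num_workers=5',
        '--temp_location=gs://my-bucket/tmp',  # placeholder
        '--streaming',
    ])

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)

Note that --worker_region and --worker_zone are mutually exclusive; the zone option is left commented out for that reason.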
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-23_08_07_56-8042074743962845450?project=<ProjectId>
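The AssertionError above is the load-test harness's own timeout check around wait_until_finish. A minimal hedged sketch of that pattern, assuming an already-built pipeline p (the timeout value is illustrative, not the one the harness uses):

    # Hedged sketch of the wait-then-cancel pattern behind the traceback above.
    # `p` is assumed to be an already-built apache_beam Pipeline; the timeout is
    # illustrative. wait_until_finish takes a duration in milliseconds and
    # returns the last observed job state.
    from apache_beam.runners.runner import PipelineState

    result = p.run()
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)  # ms
    if not PipelineState.is_terminal(state):
        result.cancel()  # mirrors the JOB_STATE_CANCELLING transition above
        raise AssertionError('job still in state %s after timeout' % state)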

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 54m 14s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/eo6mi336nvaii

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #972

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/972/display/redirect>

Changes:


------------------------------------------
[...truncated 33.67 KB...]
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.118 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.118-py3-none-any.whl (10.7 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3059756 sha256=c3ebe5a629c9e673e94027125277491a8a84476d71e3a715ce06c676898a627d
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.118 botocore-1.29.118 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.72.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK **** container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0422150210.1682176070.290598/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0422150210.1682176070.290598/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0422150210.1682176070.290598/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0422150210.1682176070.290598/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230422150750291584-9302'
 createTime: '2023-04-22T15:07:51.381045Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-22_08_07_50-15592787768425368637'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0422150210'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-22T15:07:51.381045Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-22_08_07_50-15592787768425368637]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-22_08_07_50-15592787768425368637
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-22_08_07_50-15592787768425368637?project=apache-beam-testing
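For reference, the <Job ...> block above is the Dataflow API's response to the submission. A hedged sketch of looking the same job up later by id through the public Dataflow REST API (v1b3) with google-api-python-client; the identifiers are copied from the log, everything else is illustrative:

    # Hedged sketch: fetching the job shown above by id. Credentials are taken
    # from the environment (Application Default Credentials).
    from googleapiclient.discovery import build

    dataflow = build('dataflow', 'v1b3')
    job = dataflow.projects().locations().jobs().get(
        projectId='apache-beam-testing',
        location='us-central1',
        jobId='2023-04-22_08_07_50-15592787768425368637',
    ).execute()
    print(job['currentState'])  # e.g. JOB_STATE_RUNNING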
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-22_08_07_50-15592787768425368637 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:07:56.216Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.337Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.518Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.585Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.648Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.674Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.730Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.798Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.840Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.875Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.909Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.943Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.971Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:02.995Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.027Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.060Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.094Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.129Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.162Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.183Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.205Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.301Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.330Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.360Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.393Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.417Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.639Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.667Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:03.693Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-22_08_07_50-15592787768425368637 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:16.090Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
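A note on the metric-descriptor warning above: it concerns the project's Cloud Monitoring quota for custom metrics, not this job. A hedged sketch of inspecting those descriptors with the google-cloud-monitoring client library — the filter string is an assumption about how Dataflow names them, and nothing is actually deleted:

    # Hedged sketch only: listing custom metric descriptors per the warning
    # above. Verify the filter before deleting anything in a real project;
    # the delete call is left commented out on purpose.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    descriptors = client.list_metric_descriptors(request={
        'name': project_name,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for d in descriptors:
        print('candidate for deletion:', d.type)
        # client.delete_metric_descriptor(
        #     name='%s/metricDescriptors/%s' % (project_name, d.type))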
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:42.504Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:42.537Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized **** pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:08:52.289Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:09:11.691Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:09:21.139Z: JOB_MESSAGE_DETAILED: All ****s have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T15:30:08.958Z: JOB_MESSAGE_WARNING: The ****s of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T16:16:07.204Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
[...12 further repetitions of the same autoscaling message (timestamps 16:17:12 through 19:24:23) truncated...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T19:44:24.844Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
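The autoscaler keeps re-announcing the 5-worker target above as the backlog is re-evaluated. A hedged sketch of the standard apache_beam WorkerOptions flags behind that behaviour (values illustrative, not the load test's configuration):

    # Hedged sketch of the WorkerOptions flags governing the autoscaling
    # messages above.
    from apache_beam.options.pipeline_options import PipelineOptions, WorkerOptions

    options = PipelineOptions([
        '--num_workers=5',                           # the "goal was 5" above
        '--max_num_workers=5',                       # cap upscaling at the goal
        '--autoscaling_algorithm=THROUGHPUT_BASED',  # or NONE to pin the pool size
    ])
    print(options.view_as(WorkerOptions).num_workers)  # -> 5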
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-22_08_07_50-15592787768425368637 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T20:00:44.937Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-22_08_07_50-15592787768425368637.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T20:00:44.974Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T20:00:45.029Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T20:00:45.053Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T20:00:45.074Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-22T20:00:45.095Z: JOB_MESSAGE_BASIC: Stopping **** pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-22_08_07_50-15592787768425368637?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/ejqdkhsvwfnas

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #971

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/971/display/redirect>

Changes:


------------------------------------------
[...truncated 34.60 KB...]
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3059750 sha256=f5da733f516f4987814b9b387738d6078a29e8bbe2298b0b8ff00db3eb7f9b88
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.117 botocore-1.29.117 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.72.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK **** container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
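For reference, the pre-building workflow recommended in the warning above is driven by pipeline options. A minimal sketch follows; it is hedged, not this job's configuration: the option names are taken from recent Beam releases, and the registry push URL is a placeholder.

    # Minimal sketch of the SDK container pre-building workflow.
    # The registry URL below is a placeholder, not the one used by this job.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        # Bake the extra dependencies into the worker image once, up front,
        # instead of installing them on every worker at startup:
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/<project>/prebuilt',  # placeholder
    ])

With options along these lines the dependencies are installed once at image-build time, avoiding the repetitive per-worker installation the warning describes.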
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0421132215.1682089709.800273/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0421132215.1682089709.800273/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0421132215.1682089709.800273/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0421132215.1682089709.800273/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230421150829801292-1851'
 createTime: '2023-04-21T15:08:30.987025Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-21_08_08_30-502519205835886453'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0421132215'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-21T15:08:30.987025Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-21_08_08_30-502519205835886453]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-21_08_08_30-502519205835886453
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-21_08_08_30-502519205835886453?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-21_08_08_30-502519205835886453 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:36.447Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:42.562Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:42.587Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:42.638Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:42.714Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:42.738Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:42.792Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:42.859Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:42.904Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:42.937Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:42.970Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.004Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.037Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.070Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.104Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.137Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.169Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.202Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.235Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.270Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.303Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.410Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.439Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.469Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.503Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.539Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.709Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.744Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:08:43.788Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-21_08_08_30-502519205835886453 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:09:09.863Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
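The cleanup this message recommends (deleting old, unused custom metric descriptors) can be scripted against the Cloud Monitoring API. A hedged sketch using the google-cloud-monitoring client; the project id is copied from the log, the filter string is illustrative, and the delete call is left commented out because it is destructive:

    # Sketch: list (and optionally delete) unused custom metric descriptors.
    # Review the printed list before uncommenting the delete call.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'

    descriptors = client.list_metric_descriptors(request={
        'name': project_name,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        print(descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)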
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:09:22.505Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:09:54.700Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:10:05.305Z: JOB_MESSAGE_DETAILED: All ****s have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T15:15:19.085Z: JOB_MESSAGE_WARNING: The ****s of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T16:01:46.566Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T16:03:37.596Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T16:04:38.753Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T16:34:40.541Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T16:35:50.581Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T16:48:41.665Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T17:00:42.987Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T17:06:44.575Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T17:17:56.765Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T17:24:58.105Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T17:35:49.245Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T17:44:50.194Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T17:56:22.136Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T17:59:53.985Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T18:11:55.003Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T18:22:07.146Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T18:32:58.250Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T18:40:09.683Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T18:49:10.929Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T18:58:11.856Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T19:09:03.034Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T19:16:14.433Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T19:25:15.082Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T19:35:07.668Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T19:47:09.861Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T19:53:12.114Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-21_08_08_30-502519205835886453 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T20:01:15.541Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-21_08_08_30-502519205835886453.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T20:01:15.584Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T20:01:15.656Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T20:01:15.684Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T20:01:15.709Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T20:01:15.741Z: JOB_MESSAGE_BASIC: Stopping **** pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-21_08_08_30-502519205835886453?project=<ProjectId>
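The AssertionError comes from wait_until_finish giving up on a streaming job that never reached a terminal state. One defensive pattern for calling code, sketched under the assumption of standard PipelineResult semantics (`pipeline` and `timeout_ms` stand in for the load test's own values):

    # Sketch: bound a streaming run and cancel instead of waiting forever.
    from apache_beam.runners.runner import PipelineState

    def run_with_deadline(pipeline, timeout_ms):
        result = pipeline.run()
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            # Ask the service to cancel, then wait for the cancel to land.
            result.cancel()
            state = result.wait_until_finish()
        return state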

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 54m 36s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/q6yb3qy7ulma4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #970

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/970/display/redirect>

Changes:


------------------------------------------
[...truncated 34.30 KB...]
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.116 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.116-py3-none-any.whl (10.6 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3056942 sha256=be54011d33c5b384e91ff2f282f2373790b6679f50bf2337cdcda57a37d1ec43
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.116 botocore-1.29.116 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.72.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK **** container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
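As the three lines above show, this job overrides the default SDK image with one supplied to it. When an image has already been built, it can be passed directly as a pipeline option; a sketch, where the image tag is copied from the log and the runner option is illustrative:

    # Sketch: point the job at an already-built SDK container image.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--sdk_container_image='
        'gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412',
    ])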
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0420125438.1682005872.387066/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0420125438.1682005872.387066/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0420125438.1682005872.387066/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0420125438.1682005872.387066/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230420155112388095-5759'
 createTime: '2023-04-20T15:51:13.688546Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-20_08_51_13-7430437119520329618'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0420125438'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-20T15:51:13.688546Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-20_08_51_13-7430437119520329618]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-20_08_51_13-7430437119520329618
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-20_08_51_13-7430437119520329618?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-20_08_51_13-7430437119520329618 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:21.536Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:22.739Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:22.784Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:22.847Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:22.920Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:22.953Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.016Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.074Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.117Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.143Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.181Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.215Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.261Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.288Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.324Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.356Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.396Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.419Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.459Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.494Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.530Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.618Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.649Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.669Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.692Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.719Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-20_08_51_13-7430437119520329618 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.904Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.944Z: JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:23.970Z: JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:51:42.298Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:52:07.953Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:52:41.241Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T15:52:52.042Z: JOB_MESSAGE_DETAILED: All ****s have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T16:20:41.055Z: JOB_MESSAGE_WARNING: The ****s of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T17:05:08.266Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T17:07:49.845Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T17:08:51.562Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T17:36:52.999Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T17:37:53.854Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T17:51:54.942Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T18:11:10.630Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T18:11:57.848Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T18:23:09.524Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T18:31:10.780Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T18:46:09.018Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T18:54:04.544Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T19:01:15.221Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T19:09:17.163Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T19:21:08.860Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T19:31:10.659Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T19:38:12.372Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T19:46:14.561Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T19:57:16.018Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T20:00:41.528Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-20_08_51_13-7430437119520329618.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T20:00:41.566Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T20:00:41.613Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T20:00:41.631Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T20:00:41.659Z: JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T20:00:41.676Z: JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-20_08_51_13-7430437119520329618 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-20_08_51_13-7430437119520329618?project=<ProjectId>
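
The assertion above is raised by wait_until_finish() when the cancelled job never reports a terminal state within the polling window. A minimal sketch of guarding a Dataflow run with a bounded wait plus an explicit cancel follows; build_pipeline() and TIMEOUT_MS are hypothetical placeholders, not the load-test harness itself:

    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    TIMEOUT_MS = 4 * 60 * 60 * 1000  # assumed 4-hour budget

    result = build_pipeline().run()  # build_pipeline() is a placeholder
    state = result.wait_until_finish(duration=TIMEOUT_MS)
    if state not in (PipelineState.DONE, PipelineState.FAILED, PipelineState.CANCELLED):
        # Request cancellation, then give the service one more bounded wait to
        # reach JOB_STATE_CANCELLED instead of asserting while CANCELLING.
        result.cancel()
        state = result.wait_until_finish(duration=TIMEOUT_MS)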

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 13m 41s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/owdutdfcg4pom

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #969

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/969/display/redirect>

Changes:


------------------------------------------
[...truncated 33.03 KB...]
  Using cached boto3-1.26.115-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.115 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.115-py3-none-any.whl (10.6 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3056389 sha256=5810e80e64a526ef116cf9449f33fd3cb44bb07addd712ea8c1e8568c4626ffc
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.115 botocore-1.29.115 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.72.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
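
A hedged sketch of the pre-building workflow that message suggests, passed as pipeline options; the flag names follow the linked Dataflow guide and the registry path is a made-up example, so verify both against your Beam version:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        # Pre-build the SDK worker container once with Cloud Build instead of
        # re-installing extra dependencies on every worker at startup.
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/my-project/prebuilt-beam-sdk',  # example path
    ])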
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0419150211.1681916866.706023/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0419150211.1681916866.706023/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0419150211.1681916866.706023/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0419150211.1681916866.706023/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230419150746707300-8526'
 createTime: '2023-04-19T15:07:47.815073Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-19_08_07_47-8149222094790951966'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0419150211'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-19T15:07:47.815073Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-19_08_07_47-8149222094790951966]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-19_08_07_47-8149222094790951966
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-19_08_07_47-8149222094790951966?project=apache-beam-testing
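
For context, the job submission above corresponds to launching a pipeline with options like the following; a minimal sketch that mirrors the project, region, temp location, and Streaming Engine settings visible in the log (the transform is a stand-in):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--streaming',
        '--enable_streaming_engine',
    ])
    with beam.Pipeline(options=options) as p:
        _ = p | 'Placeholder' >> beam.Impulse()  # the real job reads a synthetic source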
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-19_08_07_47-8149222094790951966 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:07:59.087Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:05.415Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:05.667Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:05.730Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:05.807Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:05.853Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:05.919Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:05.983Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.028Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.049Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.080Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.113Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.147Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.169Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.202Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.223Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.245Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.281Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.314Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.346Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.369Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
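
The fusion messages above imply the load test's shape: a synthetic read, a timing marker, a global Top combine, and a consuming/timing tail. A rough sketch of that shape, with beam.Create standing in for the SyntheticSource and trivial lambdas standing in for the Measure/Consume steps:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | 'Read synthetic' >> beam.Create(range(1000))    # stand-in source
             | 'Measure time: Start' >> beam.Map(lambda x: x)  # stand-in timer
             | 'Combine with Top 0' >> beam.combiners.Top.Largest(1)
             | 'Consume 0' >> beam.Map(len)
             | 'Measure time: End 0' >> beam.Map(lambda x: x)) # stand-in timer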
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.477Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.506Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.537Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.559Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.591Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.757Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.786Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:06.832Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:27.140Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
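
Per that warning's own advice, stale custom metric descriptors can be listed and deleted with the Cloud Monitoring client. A hedged sketch only; the filter string and the choice to delete everything it matches are assumptions, so review the matched list before deleting anything:

    from google.cloud import monitoring_v3

    PROJECT_ID = 'apache-beam-testing'  # placeholder
    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': f'projects/{PROJECT_ID}',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Deleting frees quota for new Dataflow-created descriptors.
        client.delete_metric_descriptor(name=descriptor.name)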
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-19_08_07_47-8149222094790951966 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:50.075Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:50.097Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:08:59.870Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
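
When the autoscaler's goal exceeds available quota, as in the "Resized worker pool to 4, though goal was 5" message above, the worker counts can be pinned to something reachable. A sketch with illustrative values (these flags are standard Dataflow pipeline options):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--num_workers=4',         # start at what quota allows
        '--max_num_workers=5',     # cap the autoscaler's goal
        '--autoscaling_algorithm=THROUGHPUT_BASED',
    ])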
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:09:23.085Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:09:33.862Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T15:19:26.368Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T16:03:03.522Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T16:07:07.964Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T16:37:05.039Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T16:45:06.130Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T17:11:08.258Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T17:23:22.656Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T17:46:10.013Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T18:00:14.922Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T18:21:11.951Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T18:35:17.009Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T18:56:18.141Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T19:13:19.311Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T19:31:17.348Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T19:49:18.942Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-19_08_07_47-8149222094790951966 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T20:01:41.079Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-19_08_07_47-8149222094790951966.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T20:01:41.148Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T20:01:41.268Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T20:01:41.335Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T20:01:41.379Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-19T20:01:41.409Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-19_08_07_47-8149222094790951966?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 59s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/jweq4ysywpfdq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #968

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/968/display/redirect>

Changes:


------------------------------------------
[...truncated 33.89 KB...]
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3056389 sha256=dbf3f9178ff4ed2820a91c03bdb977cd7d0191d979528039e367e9b625114833
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.115 botocore-1.29.115 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.72.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0418125358.1681830467.030297/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0418125358.1681830467.030297/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0418125358.1681830467.030297/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0418125358.1681830467.030297/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230418150747031381-9619'
 createTime: '2023-04-18T15:07:48.157634Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-18_08_07_47-502457591514789272'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0418125358'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-18T15:07:48.157634Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-18_08_07_47-502457591514789272]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-18_08_07_47-502457591514789272
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-18_08_07_47-502457591514789272?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-18_08_07_47-502457591514789272 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:54.894Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.292Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.327Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.395Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.473Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.500Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.550Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.615Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.652Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.682Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.716Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.744Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.776Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.812Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.835Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.867Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.888Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.924Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.958Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:57.979Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:58Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:58.080Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:58.108Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:58.138Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:58.171Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:58.205Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-18_08_07_47-502457591514789272 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:58.372Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:58.414Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:07:58.436Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:08:18.557Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:08:41.530Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:08:41.555Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:08:51.347Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:09:14.431Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:09:25.524Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T15:36:23.339Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T16:20:55.642Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T16:24:01.722Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T16:26:02.880Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T16:54:10.430Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T16:56:05.671Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T17:08:03.006Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T17:27:07.950Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T17:31:19.222Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T17:37:06.611Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T17:46:08.626Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T18:02:13.492Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T18:03:24.945Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T18:14:12.741Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T18:21:17.667Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T18:37:18.995Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T18:48:16.912Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T18:51:22.004Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T18:59:19.521Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T19:12:20.753Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T19:23:35.768Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T19:30:23.620Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T19:38:28.760Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T19:49:29.782Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T20:00:27.547Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T20:00:44.871Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-18_08_07_47-502457591514789272.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T20:00:44.905Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T20:00:44.948Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T20:00:44.974Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T20:00:44.998Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-18T20:00:45.022Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-18_08_07_47-502457591514789272 is in state JOB_STATE_CANCELLING
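The "metric descriptors" JOB_MESSAGE_BASIC earlier in this log refers to Cloud Monitoring's per-project limit on custom metric descriptors. A minimal cleanup sketch, assuming the google-cloud-monitoring client library and a caller with permission to delete descriptors; the project name is a placeholder, and this is not part of the load test itself:

    # Hedged sketch: list and delete unused custom metric descriptors so new
    # Dataflow user metrics can be created again.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project = "projects/<ProjectId>"  # placeholder
    descriptors = client.list_metric_descriptors(request={
        "name": project,
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        # descriptor.name is the full resource name the delete call expects
        client.delete_metric_descriptor(request={"name": descriptor.name})
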
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-18_08_07_47-502457591514789272?project=<ProjectId>
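The assertion comes from the harness's timeout handling: wait_until_finish(duration=...) returns the last observed state once the duration elapses rather than blocking forever, and a non-terminal state is turned into a failure after the cancel request seen in the log. A rough reconstruction of that pattern with placeholder options, not the actual load_test.py source:

    # Hedged sketch of the timeout/cancel pattern behind the traceback above.
    # Assumes Dataflow credentials; <ProjectId> and <Bucket> are placeholders.
    # (DirectRunner does not accept the duration argument.)
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=<ProjectId>',
        '--region=us-central1',
        '--temp_location=gs://<Bucket>/tmp',
    ])
    pipeline = beam.Pipeline(options=options)
    _ = pipeline | beam.Create([3, 1, 2]) | beam.combiners.Top.Largest(1)

    result = pipeline.run()
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)  # milliseconds
    if not PipelineState.is_terminal(state):
        result.cancel()  # produces a "Cancel request is committed" message
        raise AssertionError('Job did not reach a terminal state: %s' % state)
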

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 31s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/3ivcb5elm2yu4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #967

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/967/display/redirect?page=changes>

Changes:

[noreply] [AWS] Support usage of StsAssumeRoleWithWebIdentityCredentialsProvider


------------------------------------------
[...truncated 33.96 KB...]
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3056292 sha256=494e18f6255cd071d5261ce1016a9e4c5cb386ed58b3643b5a133c0c3aa0be60
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.114 botocore-1.29.114 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.72.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
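The pre-building hint above corresponds to two Beam pipeline options covered by the linked guide; a short hedged sketch with placeholder registry and project values, shown only to make the message concrete:

    # Hedged sketch of the SDK container pre-building options mentioned in
    # the log line above; <ProjectId> and the registry path are placeholders.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=<ProjectId>',
        '--prebuild_sdk_container_engine=cloud_build',  # or local_docker
        '--docker_registry_push_url=gcr.io/<ProjectId>/prebuilt',
    ])
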
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0417150259.1681744071.187585/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0417150259.1681744071.187585/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0417150259.1681744071.187585/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0417150259.1681744071.187585/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230417150751188633-5766'
 createTime: '2023-04-17T15:07:52.270988Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-17_08_07_51-1474147010429896761'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0417150259'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-17T15:07:52.270988Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-17_08_07_51-1474147010429896761]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-17_08_07_51-1474147010429896761
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-17_08_07_51-1474147010429896761?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-17_08_07_51-1474147010429896761 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:57.304Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:58.741Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:58.771Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:58.835Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:58.908Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:58.934Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.001Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.063Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.108Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.139Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.171Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.202Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.237Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.259Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.291Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.323Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.359Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.388Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.420Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.452Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.485Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.570Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.619Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.647Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.681Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.705Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
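The fusion messages above give the shape of the test pipeline: a synthetic source, a "Measure time: Start" step, a global Top combine, and a consuming step; the KeyWithVoid/CombinePerKey/UnKey names are what beam.CombineGlobally expands into. A rough reconstruction of that shape, using beam.Create as a stand-in for the synthetic source rather than the actual combine_test.py internals:

    # Hedged reconstruction of the pipeline shape implied by the fusion log.
    import apache_beam as beam

    def build(p):
        return (p
                | 'Read synthetic' >> beam.Create(range(1000))
                | 'Measure time: Start' >> beam.Map(lambda x: x)  # metrics stub
                | 'Combine with Top 0' >> beam.CombineGlobally(
                    beam.combiners.TopCombineFn(20))
                | 'Consume 0' >> beam.Map(lambda top: len(top))
                | 'Measure time: End 0' >> beam.Map(lambda x: x))  # metrics stub
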
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.890Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.916Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:07:59.951Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-17_08_07_51-1474147010429896761 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:08:17.219Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:08:48.751Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:09:18.468Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:09:28.967Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T15:29:48.997Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T16:13:01.767Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T16:13:56.425Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T16:15:57.192Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T16:17:04.445Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T16:44:59.177Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T16:55:56.355Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T16:59:00.909Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T16:59:57.973Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T17:19:42.978Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T17:22:00.260Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T17:37:11.558Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T17:39:02.057Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T17:53:07.808Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T18:00:09.002Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T18:08:07.282Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T18:19:08.143Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T18:26:13.236Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T18:34:11.557Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T18:41:22.354Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T18:53:13.126Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T19:01:14.146Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T19:08:15.075Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T19:15:16.575Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T19:29:18.011Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T19:35:29.151Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T19:40:19.924Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T19:51:20.924Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-17_08_07_51-1474147010429896761 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T20:00:57.018Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-17_08_07_51-1474147010429896761.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T20:00:57.047Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T20:00:57.101Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T20:00:57.125Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T20:00:57.155Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-17T20:00:57.180Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-17_08_07_51-1474147010429896761?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 17s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Publishing build scan...
https://gradle.com/s/tyhm3mc2o2edk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #966

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/966/display/redirect>

Changes:


------------------------------------------
[...truncated 32.76 KB...]
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.114 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.114-py3-none-any.whl (10.6 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3056292 sha256=1af3534330f304bbcf6e93abdb1abc340ff74896c8e71343cbd0d73711d96ef0
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.114 botocore-1.29.114 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.72.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0416150157.1681657666.114125/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0416150157.1681657666.114125/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0416150157.1681657666.114125/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0416150157.1681657666.114125/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230416150746115151-1833'
 createTime: '2023-04-16T15:07:47.218416Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-16_08_07_46-4242998773516368531'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0416150157'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-16T15:07:47.218416Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-16_08_07_46-4242998773516368531]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-16_08_07_46-4242998773516368531
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-16_08_07_46-4242998773516368531?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-16_08_07_46-4242998773516368531 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:52.291Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.403Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.540Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.599Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.657Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.686Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.744Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.807Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.848Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.873Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.893Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.915Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.937Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:58.968Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.002Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.036Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.057Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.090Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.121Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.154Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.186Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
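
The fusion messages above trace the shape of the load-test pipeline: a synthetic SDF source feeding a timed, global Top combine. A rough, self-contained sketch of that shape follows; the source and measurement stand-ins are illustrative only (the real helpers live in apache_beam.testing.synthetic_pipeline and apache_beam.testing.load_tests), but CombineGlobally genuinely expands into the KeyWithVoid / CombinePerKey / UnKey steps named in the log.

import apache_beam as beam

class MeasureTime(beam.DoFn):
    # Illustrative stand-in for the load test's metrics DoFn; here it
    # only passes elements through.
    def process(self, element):
        yield element

with beam.Pipeline() as p:  # DirectRunner by default
    _ = (
        p
        | 'Read synthetic' >> beam.Create(range(1000))  # stand-in for the SDF source
        | 'Measure time: Start' >> beam.ParDo(MeasureTime())
        | 'Combine with Top 0' >> beam.CombineGlobally(
            beam.combiners.TopCombineFn(20)).without_defaults()
        | 'Consume 0' >> beam.Map(len)
        | 'Measure time: End 0' >> beam.ParDo(MeasureTime())
    )
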
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.287Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.315Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.348Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.380Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.412Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.578Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.605Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:07:59.664Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-16_08_07_46-4242998773516368531 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:08:25.437Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
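
This quota warning recurs on every run of the suite: the project has hit the 100-descriptor limit for Dataflow-created custom metrics, so new custom.googleapis.com/* descriptors are silently skipped. A minimal cleanup sketch, assuming the google-cloud-monitoring client library and permission to delete descriptors (the filter and dry-run handling here are illustrative):

from google.cloud import monitoring_v3

def prune_custom_metric_descriptors(project_id, dry_run=True):
    # List user-defined descriptors (custom.googleapis.com/*) and
    # optionally delete them to free up quota.
    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': f'projects/{project_id}',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print('stale descriptor:', descriptor.type)
        if not dry_run:
            client.delete_metric_descriptor(name=descriptor.name)
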
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:08:36.747Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:09:05.741Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:09:13.380Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:09:41.788Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:45:49.464Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:49:54.232Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:55:51.396Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T15:56:55.914Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T16:17:57.543Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T16:18:54.666Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T16:36:55.422Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T16:51:00.204Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T16:56:07.292Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T16:57:58.680Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T17:11:59.538Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T17:25:50.530Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T17:30:15.499Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T17:38:07.101Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T17:46:04.354Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T17:58:04.953Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T18:05:19.029Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T18:07:14.492Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T18:21:11.731Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T18:30:22.600Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T18:36:13.410Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T18:45:14.314Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T18:54:15.285Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T19:02:47.713Z: JOB_MESSAGE_WARNING: Autoscaling: Unable to reach resize target in zone us-central1-b. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'load-tests-python-dataflo-04160807-kgv0-harness-d57s' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-b' does not have enough resources available to fulfill the request.  '(resource type:compute)'.
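
The stockout above is zonal capacity exhaustion, not a pipeline bug: us-central1-b had no e2 instances left, so autoscaling could not replace the failed worker. A hedged workaround sketch is to steer workers to another zone through pipeline options; the zone below is only an example:

from apache_beam.options.pipeline_options import PipelineOptions

# Sketch: keep the job in us-central1 but pin workers to a
# less-contended zone ('us-central1-f' is illustrative).
options = PipelineOptions(
    runner='DataflowRunner',
    project='apache-beam-testing',
    region='us-central1',
    worker_zone='us-central1-f',
    temp_location='gs://temp-storage-for-perf-tests/smoketests',
)
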
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T19:03:07.589Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T19:11:23.167Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T19:20:21.411Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T19:26:22.203Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T19:36:24.305Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T19:45:20.398Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T19:52:27.849Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T20:00:29.434Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T20:00:40.722Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-16_08_07_46-4242998773516368531.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T20:00:40.742Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T20:00:40.779Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T20:00:40.797Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T20:00:40.824Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-16T20:00:40.841Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-16_08_07_46-4242998773516368531 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-16_08_07_46-4242998773516368531?project=<ProjectId>
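
The assertion comes from the load-test harness: wait_until_finish(duration=...) returns whatever state the job is in once the timeout elapses, and a streaming job that is still cancelling is not terminal. A minimal sketch of the bounded-wait-then-cancel pattern the harness approximates (the trivial pipeline and the timeout value are illustrative):

import apache_beam as beam
from apache_beam.runners.runner import PipelineState

pipeline = beam.Pipeline()  # assume options configured for DataflowRunner
_ = pipeline | beam.Create([1, 2, 3])

result = pipeline.run()
state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)  # duration is in ms
if not PipelineState.is_terminal(state):
    result.cancel()                     # ask the service to stop the job
    state = result.wait_until_finish()  # block until cancellation completes
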

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 14s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5pl2cfvvmoaga

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #965

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/965/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #26284: Fix GroupIntoBatches hold


------------------------------------------
[...truncated 32.40 KB...]
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.48.0.dev0)
  Using cached google_cloud_videointelligence-2.11.1-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.48.0.dev0)
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.48.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.48.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.114 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached botocore-1.29.114-py3-none-any.whl (10.6 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.22.0,>=0.8->apache-beam==2.48.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3056292 sha256=e4751ab0cd0565c5276dc3028eb19117564fa009a508d3a0ee5d081c96a92209
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-22.2.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.114 botocore-1.29.114 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.71.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
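
The hint above refers to Beam's SDK container pre-building workflow, which bakes extra dependencies into a custom image once instead of reinstalling them on every worker at startup. A sketch of the relevant options, assuming Cloud Build is enabled on the project (the registry path is illustrative):

from apache_beam.options.pipeline_options import PipelineOptions

# Sketch: prebuild the SDK container once; workers then pull the
# finished image instead of re-installing dependencies.
options = PipelineOptions(
    runner='DataflowRunner',
    prebuild_sdk_container_engine='cloud_build',  # or 'local_docker'
    docker_registry_push_url='gcr.io/apache-beam-testing/prebuilt',  # illustrative
)
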
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0415150153.1681571282.546966/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0415150153.1681571282.546966/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0415150153.1681571282.546966/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0415150153.1681571282.546966/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230415150802548135-3541'
 createTime: '2023-04-15T15:08:05.109256Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-15_08_08_03-4062531744919047061'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0415150153'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-15T15:08:05.109256Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-15_08_08_03-4062531744919047061]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-15_08_08_03-4062531744919047061
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-15_08_08_03-4062531744919047061?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-15_08_08_03-4062531744919047061 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:10.346Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:11.857Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:11.890Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:11.954Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.008Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.036Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.090Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.134Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.177Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.211Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.239Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.272Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.293Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.317Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.339Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.370Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.403Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.435Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.458Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.481Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.513Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.590Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.630Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.661Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.694Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.715Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.890Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.914Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:12.932Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-15_08_08_03-4062531744919047061 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:42.929Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:08:50.898Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:09:17.604Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:09:25.827Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:09:27.463Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:46:04.503Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:47:08.895Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T15:49:15.705Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T16:13:07.928Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T16:16:12.851Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T16:30:09.971Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T16:46:14.665Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T16:47:15.497Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T17:05:12.693Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T17:06:13.747Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T17:15:24.543Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T17:27:15.410Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T17:38:16.183Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T17:49:21.215Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T17:52:22.209Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T18:05:19.060Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T18:15:20.065Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T18:16:24.577Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T18:32:26.174Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T18:41:23.339Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T18:48:34.209Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T18:59:24.876Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T19:17:45.781Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T19:24:26.753Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T19:38:27.628Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T19:48:30.260Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T19:52:31.379Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-15_08_08_03-4062531744919047061 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T20:00:51.882Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-15_08_08_03-4062531744919047061.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T20:00:51.913Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T20:00:51.965Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T20:00:51.984Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T20:00:52.005Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-15T20:00:52.025Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-15_08_08_03-4062531744919047061?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 8s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6drfg5ymrxbec

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #964

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/964/display/redirect>

Changes:


------------------------------------------
[...truncated 27.04 KB...]
  Using cached azure_core-1.26.4-py3-none-any.whl (173 kB)
Collecting azure-identity<2,>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.113
  Using cached botocore-1.29.113-py3-none-any.whl (10.6 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3055201 sha256=e0d55f5d9957b7048218a4387e3de0102023e80d9c7651e22df68ed71292855e
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-22.2.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.113 botocore-1.29.113 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.71.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.0 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in the SDK worker container; consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more at https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
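
(For context: the pre-building workflow referenced above is enabled through pipeline options. A minimal sketch in Python, assuming the Cloud Build-based prebuild engine; the registry path is a hypothetical placeholder, not taken from this job:)

    # Sketch only: have Dataflow prebuild the SDK worker container once so
    # workers reuse the image instead of re-installing the pipeline's extra
    # dependencies at every startup.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--prebuild_sdk_container_engine=cloud_build',
        # Hypothetical registry path; any writable container registry works.
        '--docker_registry_push_url=gcr.io/apache-beam-testing/prebuilt-sdk',
    ])
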
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0414150209.1681484878.406083/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0414150209.1681484878.406083/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0414150209.1681484878.406083/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0414150209.1681484878.406083/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230414150758407150-5157'
 createTime: '2023-04-14T15:07:59.512196Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-14_08_07_59-82375092236442943'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0414150209'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-14T15:07:59.512196Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-14_08_07_59-82375092236442943]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-14_08_07_59-82375092236442943
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-14_08_07_59-82375092236442943?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-14_08_07_59-82375092236442943 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:09.128Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:15.418Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:15.695Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:15.747Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:15.804Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:15.834Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:15.891Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:15.956Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:15.990Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.017Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.048Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.081Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.113Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.146Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.178Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.209Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.242Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.274Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.305Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.328Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.360Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.430Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.465Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.491Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.523Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.545Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.708Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.730Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:16.773Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-14_08_07_59-82375092236442943 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:28.394Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
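
(The cleanup suggested in the message above can be scripted against the Cloud Monitoring API. A hedged sketch using the google-cloud-monitoring client; the project id is taken from the log, and the destructive delete call is left commented out:)

    # Sketch: enumerate custom metric descriptors so stale ones can be pruned.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = monitoring_v3.ListMetricDescriptorsRequest(
        name="projects/apache-beam-testing",
        filter='metric.type = starts_with("custom.googleapis.com/")',
    )
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)  # destructive
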
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:08:56.706Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:09:26.621Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:09:34.505Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and have begun to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T15:26:29.739Z: JOB_MESSAGE_WARNING: The workers of the given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T16:02:10.250Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T16:04:13.994Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T16:05:15.352Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T16:13:12.134Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T16:29:18.036Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T16:33:15.792Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T16:40:16.598Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T16:44:29.021Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T17:00:20.383Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T17:09:21.486Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T17:13:32.364Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T17:21:33.092Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T17:26:25.196Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T17:38:29.902Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T17:41:30.947Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T17:54:28.051Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T18:01:40.048Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T18:12:31.716Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T18:20:36.275Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T18:21:37.307Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T18:28:54.321Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T18:45:35.762Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T18:53:46.505Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T19:00:37.288Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T19:04:38.424Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T19:18:39.907Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T19:20:51.067Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T19:33:52.747Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T19:40:43.759Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T19:49:45.038Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T20:00:41.798Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-14_08_07_59-82375092236442943.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T20:00:41.851Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T20:00:41.890Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T20:00:41.914Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T20:00:41.966Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-14T20:00:42.012Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-14_08_07_59-82375092236442943 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-14_08_07_59-82375092236442943?project=<ProjectId>
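
(For context: the harness blocked in wait_until_finish until the job was cancelled externally, since a streaming job never reaches a terminal state on its own. A minimal sketch of the pattern involved, assuming an already-configured `options` object such as the one sketched earlier and a four-hour budget; this is not the load-test code itself:)

    # Sketch: bound the wait on a streaming job and cancel it explicitly
    # rather than blocking indefinitely.
    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    result = beam.Pipeline(options=options).run()
    # For the Dataflow runner, `duration` is expressed in milliseconds.
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)
    if state not in (PipelineState.DONE,
                     PipelineState.FAILED,
                     PipelineState.CANCELLED):
        result.cancel()  # request service-side cancellation instead of waiting
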

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 5s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/er3arrhjfbwjm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #963

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/963/display/redirect>

Changes:


------------------------------------------
[...truncated 25.92 KB...]
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3
  Using cached SQLAlchemy-1.4.47-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0
  Using cached cryptography-40.0.1-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.71.0-py3-none-any.whl (406 kB)
Collecting azure-storage-blob<13,>=12.3.2
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0
  Using cached azure_core-1.26.4-py3-none-any.whl (173 kB)
Collecting azure-identity<2,>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from azure-core<2,>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.112
  Using cached botocore-1.29.112-py3-none-any.whl (10.6 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: importlib-metadata>=0.12 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3055201 sha256=b566f594f49af105b9127d065e1b3c41591a6e1b2648f2b536b5e888ba36a48d
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-22.2.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.112 botocore-1.29.112 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.71.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.0 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in the SDK worker container; consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more at https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0413150218.1681399148.189069/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0413150218.1681399148.189069/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0413150218.1681399148.189069/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0413150218.1681399148.189069/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230413151908190219-8992'
 createTime: '2023-04-13T15:19:09.206664Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-13_08_19_08-6932883193781602487'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0413150218'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-13T15:19:09.206664Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-13_08_19_08-6932883193781602487]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-13_08_19_08-6932883193781602487
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-13_08_19_08-6932883193781602487?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-13_08_19_08-6932883193781602487 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:15.108Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.004Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.032Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.099Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.166Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.197Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.248Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.305Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.348Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.371Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.392Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.413Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.464Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.499Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.521Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.551Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.585Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.617Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.650Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.683Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.716Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.821Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.850Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.881Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.912Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:17.945Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:18.120Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:18.146Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:18.187Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-13_08_19_08-6932883193781602487 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:47.917Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:19:56.512Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:20:23.317Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:20:31.357Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and have begun to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:23:45.218Z: JOB_MESSAGE_WARNING: The workers of the given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T15:59:12.384Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T16:00:26.766Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T16:01:37.852Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T16:27:24.934Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T16:37:20.072Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T17:02:27.387Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T17:16:18.301Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T17:28:19.019Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T17:36:20.035Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T17:54:21.717Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T17:55:22.841Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T19:12:24.558Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T19:13:26.125Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T19:47:27.626Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T19:49:27.725Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T19:50:29.538Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-13_08_19_08-6932883193781602487 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T20:01:11.141Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-13_08_19_08-6932883193781602487.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T20:01:11.177Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T20:01:11.236Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T20:01:11.257Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T20:01:11.279Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-13T20:01:11.314Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-13_08_19_08-6932883193781602487?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 44m 12s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4tl2nuyvus2va

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #962

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/962/display/redirect>

Changes:


------------------------------------------
[...truncated 24.10 KB...]
Collecting scikit-learn>=0.20.0
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3
  Using cached SQLAlchemy-1.4.47-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0
  Using cached cryptography-40.0.1-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.71.0-py3-none-any.whl (406 kB)
Collecting boto3>=1.9
  Using cached boto3-1.26.111-py3-none-any.whl (135 kB)
Collecting cachetools<5,>=3.1.0
  Using cached cachetools-4.2.4-py3-none-any.whl (10 kB)
Collecting google-apitools<0.5.32,>=0.5.31
  Using cached google_apitools-0.5.31-py3-none-any.whl
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-2.17.2-py2.py3-none-any.whl (178 kB)
Collecting google-auth-httplib2<0.2.0,>=0.1.0
  Using cached google_auth_httplib2-0.1.0-py2.py3-none-any.whl (9.3 kB)
Collecting google-cloud-datastore<3,>=2.0.0
  Using cached google_cloud_datastore-2.15.1-py2.py3-none-any.whl (175 kB)
Collecting google-cloud-pubsub<3,>=2.1.0
  Using cached google_cloud_pubsub-2.16.0-py2.py3-none-any.whl (263 kB)
Collecting google-cloud-pubsublite<2,>=1.2.0
  Using cached google_cloud_pubsublite-1.7.0-py2.py3-none-any.whl (273 kB)
Collecting google-cloud-bigquery<4,>=2.0.0
  Using cached google_cloud_bigquery-3.9.0-py2.py3-none-any.whl (217 kB)
Collecting google-cloud-bigquery-storage<3,>=2.6.3
  Using cached google_cloud_bigquery_storage-2.19.1-py2.py3-none-any.whl (190 kB)
Collecting google-cloud-core<3,>=2.0.0
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<3,>=2.0.0
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0
  Using cached google_cloud_spanner-3.30.0-py2.py3-none-any.whl (327 kB)
Collecting google-cloud-dlp<4,>=3.0.0
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0
  Using cached google_cloud_language-2.9.1-py2.py3-none-any.whl (99 kB)
Collecting google-cloud-videointelligence<3,>=2.0
  Using cached google_cloud_videointelligence-2.11.1-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting azure-storage-blob>=12.3.2
  Using cached azure_storage_blob-12.15.0-py3-none-any.whl (387 kB)
Collecting azure-core>=1.7.0
  Using cached azure_core-1.26.4-py3-none-any.whl (173 kB)
Collecting azure-identity>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from azure-core>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.111
  Using cached botocore-1.29.111-py3-none-any.whl (10.6 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.0)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: importlib-metadata>=0.12 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3055111 sha256=d01f659e534bf6ec0b49290ca22fc1330ad3b88cf3871e6c47875d16909ab728
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-22.2.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.111 botocore-1.29.111 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.2 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.30.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.71.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.0 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "/home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
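
The pre-building hint above refers to Beam's SDK container pre-building workflow, which bakes the extra dependencies into a custom worker image once instead of reinstalling them on every worker start. A minimal sketch of the pipeline options involved, assuming Cloud Build is available (the registry path below is a hypothetical placeholder, not a value from this job):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Sketch only: pre-build the SDK worker container and push it to a
    # registry so Dataflow workers boot from a ready-made image.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/my-project/beam-prebuilt',  # hypothetical
    ])
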
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0412150211.1681312077.404226/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0412150211.1681312077.404226/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0412150211.1681312077.404226/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0412150211.1681312077.404226/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230412150757405277-5370'
 createTime: '2023-04-12T15:07:58.430141Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-12_08_07_57-12700188166399323050'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0412150211'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-12T15:07:58.430141Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
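
For reference, a streaming Dataflow job with the fields shown in the Job proto above is typically launched with options along these lines (project, region, temp location, and job name are taken from the log; the rest is a hedged sketch, not necessarily the load test's exact configuration):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--streaming',  # yields type JOB_TYPE_STREAMING, as in the proto above
        '--job_name=load-tests-python-dataflow-streaming-combine-1-0412150211',
    ])
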
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-12_08_07_57-12700188166399323050]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-12_08_07_57-12700188166399323050
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-12_08_07_57-12700188166399323050?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-12_08_07_57-12700188166399323050 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:03.816Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.036Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.059Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.122Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.185Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.204Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.259Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.307Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.335Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.363Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.388Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.417Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.443Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.473Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.502Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.525Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.560Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.593Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.625Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.659Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.691Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
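
The fusion messages above show combiner lifting: Dataflow splits CombinePerKey into a ConvertToAccumulators step before the GroupByKey/WriteStream and Combine/Extract steps after the ReadStream, so partial accumulation happens upstream of the shuffle. A minimal sketch of a pipeline that produces step names like these (an assumed shape, not the load test's exact code; CombineGlobally expands into the KeyWithVoid/CombinePerKey/UnKey steps seen in the log):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(1000))  # stand-in for the synthetic source
         | 'Combine with Top 0' >> beam.CombineGlobally(
             beam.combiners.TopCombineFn(n=10))
         | 'Consume 0' >> beam.Map(len)
         | 'Measure time: End 0' >> beam.Map(lambda n: n))
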
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.798Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.818Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.847Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.879Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:06.910Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:07.064Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:07.088Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:07.120Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-12_08_07_57-12700188166399323050 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:39.608Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
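
The warning above concerns user-defined metrics: each distinct namespace/name pair reported by a pipeline can create a custom.googleapis.com/* metric descriptor in Cloud Monitoring, and this project has hit the 100-descriptor limit. A user metric in the Python SDK looks roughly like this (class and metric names here are illustrative, not from the load test):

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class CountingFn(beam.DoFn):
        def __init__(self):
            # Each distinct namespace/name pair may surface as a custom
            # metric descriptor on Dataflow, counting toward the quota.
            self.elements = Metrics.counter('load_test', 'elements')

        def process(self, element):
            self.elements.inc()
            yield element
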
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:08:56.106Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:09:23.722Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:09:31.697Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-12T15:25:48.793Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-5' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:215)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
	at com.sun.proxy.$Proxy139.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1215)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1207)
	at hudson.Launcher$ProcStarter.join(Launcher.java:524)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
	at hudson.model.Build$BuildExecution.build(Build.java:199)
	at hudson.model.Build$BuildExecution.doRun(Build.java:164)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
	at hudson.model.Run.execute(Run.java:1896)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
	at hudson.model.ResourceController.execute(ResourceController.java:101)
	at hudson.model.Executor.run(Executor.java:442)
Caused by: java.io.IOException: Pipe closed after 0 cycles
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:126)
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:105)
	at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:94)
	at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:75)
	at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:105)
	at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-5 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #961

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/961/display/redirect>

Changes:


------------------------------------------
[...truncated 26.49 KB...]
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0
  Using cached cryptography-40.0.1-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.71.0-py3-none-any.whl (406 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.110
  Using cached botocore-1.29.110-py3-none-any.whl (10.6 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.0)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3055111 sha256=f1a623d6bfa8a48650147d8e2434513d32dca32b3b2c78cc23d27b3c98f48950
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-22.2.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.110 botocore-1.29.110 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.2 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.30.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.71.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.0 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0411150154.1681225670.616467/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0411150154.1681225670.616467/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0411150154.1681225670.616467/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0411150154.1681225670.616467/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230411150750617475-1908'
 createTime: '2023-04-11T15:07:51.626428Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-11_08_07_51-5624438439069575244'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0411150154'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-11T15:07:51.626428Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-11_08_07_51-5624438439069575244]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-11_08_07_51-5624438439069575244
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-11_08_07_51-5624438439069575244?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-11_08_07_51-5624438439069575244 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:07:59.432Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.237Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.264Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.316Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.374Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.402Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.446Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.501Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.529Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.559Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.590Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.624Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.656Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.689Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.715Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.737Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.762Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.785Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.818Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.844Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.870Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.962Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:01.992Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:02.024Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:02.045Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:02.081Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:02.233Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:02.253Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:02.291Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-11_08_07_51-5624438439069575244 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:21.919Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:08:57.430Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:09:24.936Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:09:33.119Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:21:46.712Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:57:14.581Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:58:19.639Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T15:59:20.466Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T16:23:21.448Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T16:24:22.222Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T16:42:33.388Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T16:48:20.950Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T16:56:35.472Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T17:07:23.185Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T17:18:23.970Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T17:24:20.065Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T17:29:37.468Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T17:42:29.484Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T17:43:31.216Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T18:00:42.582Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T18:05:33.556Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T18:16:34.343Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T18:27:36.051Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T18:38:47.419Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T18:43:38.344Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T18:54:39.254Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T18:55:40.517Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T19:13:41.759Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T19:20:42.691Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T19:38:43.588Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T19:46:45.552Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-11_08_07_51-5624438439069575244 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T20:00:56.234Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-11_08_07_51-5624438439069575244.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T20:00:56.272Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T20:00:56.312Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T20:00:56.331Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T20:00:56.359Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-11T20:00:56.375Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-11_08_07_51-5624438439069575244?project=<ProjectId>
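
The assertion above is raised by the Dataflow runner when its polling loop ends while the job is still in a non-terminal state (here it was stuck in JOB_STATE_CANCELLING). The calling pattern, with a defensive cancel, looks roughly like this (the timeout value is illustrative, and `pipeline` stands for the already-constructed load-test pipeline):

    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()
    # duration is in milliseconds; returns the last state seen.
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)
    if not PipelineState.is_terminal(state):
        result.cancel()  # ask the service to cancel rather than wait forever
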

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 1s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5vapxkbudmgyo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_LoadTests_Python_Combine_Dataflow_Streaming - Build # 959 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Python_Combine_Dataflow_Streaming - Build # 959 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/959/ to view the results.

beam_LoadTests_Python_Combine_Dataflow_Streaming - Build # 958 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_LoadTests_Python_Combine_Dataflow_Streaming - Build # 958 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/958/ to view the results.

Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #956

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/956/display/redirect>

Changes:


------------------------------------------
[...truncated 23.75 KB...]
Collecting pytest<8.0,>=7.1.2
  Using cached pytest-7.2.2-py3-none-any.whl (317 kB)
Collecting pytest-xdist<4,>=2.5.0
  Using cached pytest_xdist-3.2.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3
  Using cached SQLAlchemy-1.4.47-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0
  Using cached cryptography-40.0.1-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.70.2-py3-none-any.whl (403 kB)
Collecting boto3>=1.9
  Using cached boto3-1.26.107-py3-none-any.whl (135 kB)
Collecting cachetools<5,>=3.1.0
  Using cached cachetools-4.2.4-py3-none-any.whl (10 kB)
Collecting google-apitools<0.5.32,>=0.5.31
  Using cached google_apitools-0.5.31-py3-none-any.whl
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-2.17.2-py2.py3-none-any.whl (178 kB)
Collecting google-auth-httplib2<0.2.0,>=0.1.0
  Using cached google_auth_httplib2-0.1.0-py2.py3-none-any.whl (9.3 kB)
Collecting google-cloud-datastore<3,>=2.0.0
  Using cached google_cloud_datastore-2.15.1-py2.py3-none-any.whl (175 kB)
Collecting google-cloud-pubsub<3,>=2.1.0
  Using cached google_cloud_pubsub-2.15.2-py2.py3-none-any.whl (243 kB)
Collecting google-cloud-pubsublite<2,>=1.2.0
  Using cached google_cloud_pubsublite-1.7.0-py2.py3-none-any.whl (273 kB)
Collecting google-cloud-bigquery<4,>=2.0.0
  Using cached google_cloud_bigquery-3.9.0-py2.py3-none-any.whl (217 kB)
Collecting google-cloud-bigquery-storage<3,>=2.6.3
  Using cached google_cloud_bigquery_storage-2.19.1-py2.py3-none-any.whl (190 kB)
Collecting google-cloud-core<3,>=2.0.0
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<3,>=2.0.0
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0
  Using cached google_cloud_spanner-3.30.0-py2.py3-none-any.whl (327 kB)
Collecting google-cloud-dlp<4,>=3.0.0
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0
  Using cached google_cloud_language-2.9.1-py2.py3-none-any.whl (99 kB)
Collecting google-cloud-videointelligence<3,>=2.0
  Using cached google_cloud_videointelligence-2.11.1-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from azure-core>=1.7.0->apache-beam==2.48.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.107
  Using cached botocore-1.29.107-py3-none-any.whl (10.6 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3055084 sha256=8ff45d955d6eef3306d1066de546aa1c02becb123fe0e777866732d10795da7d
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.107 botocore-1.29.107 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.2 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.30.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.9 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "/home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327" for Docker environment
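
The pre-building hint above refers to baking the pipeline's extra dependencies into the SDK worker container once, instead of reinstalling them on every worker at startup. A minimal sketch of how a pipeline might opt in, assuming the flags described in the linked guide; the project, bucket and registry values are placeholders, not values from this job:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                # placeholder
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',  # placeholder
        # Build the worker container once, up front, via Cloud Build:
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=us-central1-docker.pkg.dev/my-project/my-repo',  # placeholder
    ])
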
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0406150852.1680794187.749028/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0406150852.1680794187.749028/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0406150852.1680794187.749028/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0406150852.1680794187.749028/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230406151627750157-5257'
 createTime: '2023-04-06T15:16:28.891867Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-06_08_16_28-4561084098787832401'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0406150852'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-06T15:16:28.891867Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-06_08_16_28-4561084098787832401]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-06_08_16_28-4561084098787832401
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-06_08_16_28-4561084098787832401?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-06_08_16_28-4561084098787832401 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:44.290Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:45.769Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:45.803Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:45.859Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:45.922Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:45.960Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.050Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.144Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.180Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.219Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.240Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.263Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.292Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.321Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.354Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.386Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.421Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.447Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.474Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.505Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.539Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
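
The step names being fused above (KeyWithVoid, CombinePerKey/Combine, UnKey) are the standard expansion of a global combine in the Python SDK. A minimal pipeline that produces a similar graph, using a stand-in source and Top size rather than the load test's actual configuration:

    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(1000))  # stand-in for the synthetic source
         # CombineGlobally expands to KeyWithVoid -> CombinePerKey -> UnKey,
         # the same steps the fusion messages above are wiring together.
         | 'Combine with Top 0' >> beam.CombineGlobally(combiners.TopCombineFn(10))
         | 'Consume 0' >> beam.Map(len))
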
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.635Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.661Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.687Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.709Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:46.732Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:47.796Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:47.826Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:47.858Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-06_08_16_28-4561084098787832401 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:16:59.380Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
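
The metric-descriptor warning is harmless to the job itself; user counters still land under dataflow.googleapis.com/job/user_counter. If the custom.googleapis.com/* descriptors do matter, the backlog can be pruned with the Cloud Monitoring API. A sketch of that cleanup with a placeholder project id; the delete call is commented out because it is destructive:

    from google.cloud import monitoring_v3  # pip install google-cloud-monitoring

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/my-project'  # placeholder

    for descriptor in client.list_metric_descriptors(name=project_name):
        if descriptor.type.startswith('custom.googleapis.com/'):
            print('stale candidate:', descriptor.name)
            # client.delete_metric_descriptor(name=descriptor.name)
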
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:17:35.177Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:18:06.937Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:18:15.254Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-06T15:51:15.764Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
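
For reference, the worker shape reported in this run (five e2-standard-2 workers in us-central1-b, with Streaming Engine on) corresponds to pipeline options along these lines. This is a reconstruction from the log messages above, not the test's actual command line:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--streaming',
        '--enable_streaming_engine',
        '--num_workers=5',
        '--worker_machine_type=e2-standard-2',
        '--region=us-central1',
        '--worker_zone=us-central1-b',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
    ])
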
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-5' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:215)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
	at com.sun.proxy.$Proxy142.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1215)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1207)
	at hudson.Launcher$ProcStarter.join(Launcher.java:524)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
	at hudson.model.Build$BuildExecution.build(Build.java:199)
	at hudson.model.Build$BuildExecution.doRun(Build.java:164)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
	at hudson.model.Run.execute(Run.java:1896)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
	at hudson.model.ResourceController.execute(ResourceController.java:101)
	at hudson.model.Executor.run(Executor.java:442)
Caused by: java.io.IOException: Pipe closed after 0 cycles
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:126)
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:105)
	at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:94)
	at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:75)
	at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:105)
	at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-5 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #955

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/955/display/redirect>

Changes:


------------------------------------------
[...truncated 26.06 KB...]
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<3,>=2.0.0
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0
  Using cached google_cloud_spanner-3.30.0-py2.py3-none-any.whl (327 kB)
Collecting google-cloud-dlp<4,>=3.0.0
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0
  Using cached google_cloud_language-2.9.1-py2.py3-none-any.whl (99 kB)
Collecting google-cloud-videointelligence<3,>=2.0
  Using cached google_cloud_videointelligence-2.11.1-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.106
  Using cached botocore-1.29.106-py3-none-any.whl (10.6 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3054447 sha256=adf64ee4fb0965b64d86dc7711d776f6ef250d7043b51c7b74deb70bbb60d1d0
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.106 botocore-1.29.106 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.30.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.9 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0405150558.1680707275.955354/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0405150558.1680707275.955354/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0405150558.1680707275.955354/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0405150558.1680707275.955354/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230405150755956374-3592'
 createTime: '2023-04-05T15:07:57.287Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-05_08_07_56-10923526522158027324'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0405150558'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-05T15:07:57.287Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-05_08_07_56-10923526522158027324]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-05_08_07_56-10923526522158027324
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-05_08_07_56-10923526522158027324?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-05_08_07_56-10923526522158027324 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:03.293Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:04.623Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:04.659Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:04.724Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:04.800Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:04.829Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:04.905Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:04.974Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.015Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.053Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.088Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.116Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.154Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.181Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.215Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.248Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.277Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.303Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.332Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.377Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.423Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.526Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.561Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.602Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.626Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:05.649Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:06.721Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:06.760Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:06.794Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-05_08_07_56-10923526522158027324 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:08:24.307Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:09:00.559Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:09:31.363Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:09:40.328Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:42:17.268Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:44:14.800Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T15:45:15.530Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T16:09:18.335Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T16:10:24.450Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T16:37:25.619Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T16:39:26.561Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T16:41:38.176Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T17:09:25.824Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T17:19:30.622Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T17:42:37.956Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T17:56:28.898Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T18:10:49.149Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T18:26:41.591Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T18:45:46.225Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T19:03:47.369Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T19:22:34.513Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T19:40:45.379Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T19:58:36.178Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T20:00:43.126Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-05_08_07_56-10923526522158027324.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T20:00:43.161Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T20:00:43.214Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T20:00:43.240Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T20:00:43.274Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-05T20:00:43.306Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-05_08_07_56-10923526522158027324 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-05_08_07_56-10923526522158027324?project=<ProjectId>
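
The assertion fires because wait_until_finish(duration=self.timeout_ms) returned while the job was still in JOB_STATE_CANCELLING: cancellation had been requested but the job had not yet reached a terminal state. A defensive pattern for such harnesses, sketched here rather than quoted from the Beam source, is to follow the bounded wait with an explicit cancel and a second wait ('result' stands for the PipelineResult from pipeline.run(); the timeout is illustrative):

    from apache_beam.runners.runner import PipelineState

    # Wait up to the test timeout (milliseconds), then force termination.
    state = result.wait_until_finish(duration=60 * 60 * 1000)
    if not PipelineState.is_terminal(state):
        result.cancel()                     # ask Dataflow to stop the streaming job
        state = result.wait_until_finish()  # block until CANCELLED/DONE/FAILED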

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
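
Following those hints, a rerun with fuller diagnostics would look like: ./gradlew :sdks:python:apache_beam:testing:load_tests:run --stacktrace --info --warning-mode all (the task path is the failing task above; the flags are standard Gradle options).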

BUILD FAILED in 4h 55m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/s5d3o7k5cgdjw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #954

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/954/display/redirect>

Changes:


------------------------------------------
[...truncated 27.29 KB...]
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting azure-storage-blob>=12.3.2
  Using cached azure_storage_blob-12.15.0-py3-none-any.whl (387 kB)
Collecting azure-core>=1.7.0
  Using cached azure_core-1.26.3-py3-none-any.whl (174 kB)
Collecting azure-identity>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.105
  Using cached botocore-1.29.105-py3-none-any.whl (10.6 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3054515 sha256=bb38cfad58e0088eaec2ac481ca04745be4c723b06963d93611de7fe7264d13b
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.105 botocore-1.29.105 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.30.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.9 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0404151224.1680622320.610274/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0404151224.1680622320.610274/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0404151224.1680622320.610274/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0404151224.1680622320.610274/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230404153200611340-5621'
 createTime: '2023-04-04T15:32:01.806321Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-04_08_32_01-393984547216670280'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0404151224'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-04T15:32:01.806321Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-04_08_32_01-393984547216670280]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-04_08_32_01-393984547216670280
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-04_08_32_01-393984547216670280?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-04_08_32_01-393984547216670280 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:08.890Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.186Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.210Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.282Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.351Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.380Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.449Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.507Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.549Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.576Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.607Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.638Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.663Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.690Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.726Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.760Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.794Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.828Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.852Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.883Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:10.916Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:11.039Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:11.075Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:11.112Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:11.143Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:11.175Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-04_08_32_01-393984547216670280 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:12.266Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:12.300Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:12.485Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:38.650Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
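
The metric-descriptor warning above is benign for the test itself, but the cleanup it suggests can be scripted instead of clicked through the APIs Explorer. A hedged sketch using the google-cloud-monitoring client library (the project id is a placeholder; run the list step on its own before deleting anything):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/<ProjectId>'  # placeholder
    request = monitoring_v3.ListMetricDescriptorsRequest(
        name=project_name,
        filter='metric.type = starts_with("custom.googleapis.com/")')
    for descriptor in client.list_metric_descriptors(request=request):
        # Deleting unused descriptors frees quota for new
        # Dataflow-created user metrics.
        client.delete_metric_descriptor(name=descriptor.name)
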
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:32:55.303Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:33:29.058Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T15:33:39.804Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T16:15:12.017Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T16:19:13.699Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T16:20:15.143Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T16:48:12.442Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T16:49:17.564Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T17:03:36.028Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T17:20:17.112Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T17:22:21.788Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T17:32:19.434Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T17:49:20.619Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T17:54:25.452Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T18:00:27.678Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T18:15:29.010Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T18:23:36.379Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T18:34:32.072Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T18:37:29.218Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T18:49:40.547Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T18:52:35.159Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T19:10:42.857Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T19:18:33.841Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T19:24:54.967Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T19:32:46.855Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T19:37:37.966Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T19:52:38.742Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T19:53:40.522Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-04_08_32_01-393984547216670280 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T20:01:10.441Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-04_08_32_01-393984547216670280.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T20:01:10.476Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T20:01:10.534Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T20:01:10.552Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T20:01:10.580Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-04T20:01:10.602Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-04_08_32_01-393984547216670280?project=<ProjectId>
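
The traceback shows where the harness gives up: load_test.py bounds wait_until_finish with a timeout and asserts that the job reached a terminal state, so a streaming job that is still cancelling when the wait expires trips the assertion. The pattern, as a minimal sketch (the function name, timeout, and console URL are placeholders, not the actual harness code):

    from apache_beam.runners.runner import PipelineState

    def wait_or_fail(result, timeout_ms, console_url='<ConsoleUrl>'):
        # Hedged sketch of the pattern in load_test.py: bound the wait,
        # cancel on timeout, then fail with the console URL.
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            result.cancel()  # request cancellation of the still-running job
            raise AssertionError(
                'Job did not reach a terminal state. Console URL: %s'
                % console_url)
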

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 32m 39s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/we625q3tqabac

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #953

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/953/display/redirect>

Changes:


------------------------------------------
[...truncated 26.44 KB...]
  Using cached google_cloud_vision-3.4.1-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting azure-storage-blob>=12.3.2
  Using cached azure_storage_blob-12.15.0-py3-none-any.whl (387 kB)
Collecting azure-core>=1.7.0
  Using cached azure_core-1.26.3-py3-none-any.whl (174 kB)
Collecting azure-identity>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.104
  Using cached botocore-1.29.104-py3-none-any.whl (10.6 MB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3051428 sha256=dd8eb36ed790a5e53c8a53397527968c08380aac58cf3b9e15d2d6fb39b3a18b
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.104 botocore-1.29.104 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.30.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.9 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
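
The pre-building hint above can be acted on with a couple of pipeline options. A hedged sketch (flag spellings assume the Beam Python SDK's setup options as described in the linked guide; the registry URL is a placeholder):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Assumed flags for the SDK container pre-building workflow; verify
    # against the linked documentation before relying on them.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--prebuild_sdk_container_engine=cloud_build',  # or local_docker
        '--docker_registry_push_url=gcr.io/<ProjectId>/prebuilt',  # placeholder
    ])
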
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0403150226.1680534471.617738/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0403150226.1680534471.617738/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0403150226.1680534471.617738/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0403150226.1680534471.617738/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230403150751618770-5270'
 createTime: '2023-04-03T15:07:52.712208Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-03_08_07_52-5578538491809644285'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0403150226'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-03T15:07:52.712208Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-03_08_07_52-5578538491809644285]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-03_08_07_52-5578538491809644285
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-03_08_07_52-5578538491809644285?project=apache-beam-testing
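
The job spec above follows from the pipeline options the test passes. A sketch of options consistent with the surrounding messages (Streaming Engine, 5 workers, e2-standard-2, us-central1); the values are read off this log, and the flag spellings assume the standard Beam Python options:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Illustrative reconstruction of the test's Dataflow options, not
    # the exact command line used by the Jenkins job.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--streaming',
        '--enable_streaming_engine',
        '--num_workers=5',
        '--machine_type=e2-standard-2',
    ])
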
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-03_08_07_52-5578538491809644285 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:04.692Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.117Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.159Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.219Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.294Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.328Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.386Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.451Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.498Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.531Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.563Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.597Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.631Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.664Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.696Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.729Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.764Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.799Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.820Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.851Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.885Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:07.994Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:08.031Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:08.050Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:08.085Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:08.119Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-03_08_07_52-5578538491809644285 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:09.185Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:09.220Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:09.250Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:36.012Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:08:55.861Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:09:27.412Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:09:38.031Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:52:46.577Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:53:33.776Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T15:54:34.566Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T16:26:46.285Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T16:27:38.125Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T17:00:49.566Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T17:01:50.341Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T17:12:42.003Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T17:13:53.117Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T17:34:44.050Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T17:42:45.043Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T17:46:46.925Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T17:56:58.324Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T18:10:13.489Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T18:19:01.176Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T18:28:53.038Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T18:32:54.764Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T18:46:59.376Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T18:53:56.546Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T19:05:01.414Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T19:14:58.668Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T19:24:59.830Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T19:32:01.103Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T19:43:01.932Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T19:50:03.142Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-03_08_07_52-5578538491809644285 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T20:00:59.993Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-03_08_07_52-5578538491809644285.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T20:01:00.052Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T20:01:00.115Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T20:01:00.142Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T20:01:00.178Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-03T20:01:00.196Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-03_08_07_52-5578538491809644285?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 9s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4sjeh5h2besea

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #952

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/952/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #26006: Google Cloud libraries BOM 26.11.0


------------------------------------------
[...truncated 26.31 KB...]
  Using cached psycopg2_binary-2.9.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0
  Using cached cryptography-40.0.1-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.70.1-py3-none-any.whl (403 kB)
Collecting boto3>=1.9
  Using cached boto3-1.26.104-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.104
  Using cached botocore-1.29.104-py3-none-any.whl (10.6 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3051428 sha256=aa23618a84b4adcb9ad8fd2c540c732fbf38502923746ea12f3f4557f65c4b17
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.104 botocore-1.29.104 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.29.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.9 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0402150156.1680448082.564263/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0402150156.1680448082.564263/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0402150156.1680448082.564263/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0402150156.1680448082.564263/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230402150802565420-1612'
 createTime: '2023-04-02T15:08:03.718496Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-02_08_08_03-2533546440906497658'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0402150156'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-02T15:08:03.718496Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-02_08_08_03-2533546440906497658]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-02_08_08_03-2533546440906497658
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-02_08_08_03-2533546440906497658?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-02_08_08_03-2533546440906497658 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:10.853Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:17.166Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.180Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.442Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.505Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.534Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.595Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.656Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.695Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.711Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.732Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.764Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.798Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.831Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.864Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.894Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.927Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.950Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.971Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:22.996Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:23.017Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
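The fused stages above (KeyWithVoid, CombinePerKey/Combine/ConvertToAccumulators, Combine/Extract, UnKey) are the expansion Dataflow produces when a global combine is lifted. Below is a minimal sketch of the pipeline shape those messages imply, assuming beam.Create as a stand-in for the load test's synthetic source and a Top combiner per the step names; it is not the actual test code.

    import apache_beam as beam

    # Sketch only: beam.Create stands in for the synthetic source, and the
    # step labels mirror the log. Top.Largest is a global combine, which
    # expands into the KeyWithVoid / CombinePerKey / UnKey steps fused above.
    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read synthetic' >> beam.Create(range(1000))
            | 'Combine with Top 0' >> beam.combiners.Top.Largest(10)
            | 'Consume 0' >> beam.Map(len))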
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:23.117Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:23.148Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:23.180Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:23.202Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:23.233Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-02_08_08_03-2533546440906497658 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:24.298Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:24.317Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:24.355Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:08:31.466Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
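The metric-descriptor notice above is a quota warning, and its two apis-explorer links point at the list/delete methods it recommends. Here is a sketch of that cleanup using the google-cloud-monitoring client; the project id is a placeholder and the delete call is deliberately left commented out.

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/my-project'  # placeholder project id
    # Inspect only the custom.googleapis.com/* descriptors the notice mentions.
    for descriptor in client.list_metric_descriptors(
            request={'name': project_name,
                     'filter': 'metric.type = starts_with("custom.googleapis.com/")'}):
        print(descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)  # irreversible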
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:09:02.834Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:09:35.729Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:09:47.184Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:53:01.179Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:55:50.038Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T15:56:52.078Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T16:26:53.821Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T16:27:54.584Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T16:50:55.448Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T16:51:56.329Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T17:02:57.033Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T17:09:59.234Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T17:24:10.713Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T17:32:02.202Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T17:38:06.921Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T17:46:04.557Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T17:57:05.561Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T18:08:06.634Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T18:15:08.362Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T18:22:09.714Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T18:44:10.822Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T18:56:21.521Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T19:19:13.129Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T19:20:14.512Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T19:31:16.144Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T19:34:17.711Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T19:52:28.584Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T19:59:20.562Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
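The autoscaling messages above repeat because the job both starts at and is capped at 5 workers, so every re-evaluation lands on the same target. A sketch of the WorkerOptions that produce a pinned pool like this; the values are illustrative, not taken from the test's configuration:

    from apache_beam.options.pipeline_options import PipelineOptions, WorkerOptions

    options = PipelineOptions()
    workers = options.view_as(WorkerOptions)
    workers.num_workers = 5      # initial pool size, matching the log
    workers.max_num_workers = 5  # cap autoscaling at the same size
    workers.autoscaling_algorithm = 'THROUGHPUT_BASED'  # 'NONE' disables scaling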
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-02_08_08_03-2533546440906497658 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T20:00:41.111Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-02_08_08_03-2533546440906497658.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T20:00:41.140Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T20:00:41.178Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T20:00:41.193Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T20:00:41.223Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-02T20:00:41.238Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-02_08_08_03-2533546440906497658?project=<ProjectId>
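The traceback shows load_test.py waiting on the Dataflow result with a millisecond timeout and asserting on the returned state. A rough sketch of that pattern follows; the tiny Create pipeline and the timeout value are placeholders, and DataflowRunner (unlike DirectRunner) honors the duration argument.

    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    # Assumes Dataflow options come in on the command line, e.g.
    # --runner=DataflowRunner --project=... --region=... --temp_location=...
    p = beam.Pipeline(argv=None)
    _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
    result = p.run()
    state = result.wait_until_finish(duration=5 * 60 * 1000)  # milliseconds
    if not PipelineState.is_terminal(state):
        result.cancel()  # what surfaces as JOB_STATE_CANCELLING in the log
        raise AssertionError('job did not reach a terminal state in time')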

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 8s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cbyhp3vlgoqrw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #951

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/951/display/redirect>

Changes:


------------------------------------------
[...truncated 26.91 KB...]
  Using cached hypothesis-6.70.1-py3-none-any.whl (403 kB)
Collecting boto3>=1.9
  Using cached boto3-1.26.104-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.104
  Using cached botocore-1.29.104-py3-none-any.whl (10.6 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3051428 sha256=bdcacc049766144de7f1661864088684017ed69b4c530f8a6e476b4263145290
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.104 botocore-1.29.104 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.29.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.9 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
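Two of the setup lines above correspond to pipeline options: staging_location falls back to temp_location when unset, and the pre-building hint refers to the SDK container workflow from the linked docs. A sketch of setting both; the bucket and project names are placeholders:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions, SetupOptions)

    options = PipelineOptions()
    gcp = options.view_as(GoogleCloudOptions)
    gcp.project = 'my-project'                 # placeholder
    gcp.region = 'us-central1'
    gcp.temp_location = 'gs://my-bucket/temp'  # reused as staging_location
    setup = options.view_as(SetupOptions)
    # Opt in to pre-building the SDK worker image ('local_docker' also works).
    setup.prebuild_sdk_container_engine = 'cloud_build'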
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0401150151.1680361666.180166/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0401150151.1680361666.180166/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0401150151.1680361666.180166/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0401150151.1680361666.180166/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230401150746181154-4563'
 createTime: '2023-04-01T15:07:47.265923Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-04-01_08_07_46-6557315474715878263'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0401150151'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-04-01T15:07:47.265923Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-01_08_07_46-6557315474715878263]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-01_08_07_46-6557315474715878263
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-01_08_07_46-6557315474715878263?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-01_08_07_46-6557315474715878263 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:07:53.560Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:07:59.659Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:07:59.744Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:07:59.791Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:07:59.852Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:07:59.879Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:07:59.926Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:07:59.970Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.007Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.035Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.068Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.102Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.135Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.158Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.191Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.221Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.248Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.270Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.292Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.313Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.344Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.430Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.480Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.512Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.544Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:00.571Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:01.637Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:01.664Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:01.689Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-01_08_07_46-6557315474715878263 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:31.535Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:08:40.331Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:09:14.223Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:09:25.736Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:50:56.748Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:52:54.481Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:53:55.601Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T15:54:57.045Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T16:24:08.424Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T16:34:58.985Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T16:36:01.677Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T16:38:03.361Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T16:57:04.742Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T17:06:06.656Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T17:08:08.516Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T17:19:10.870Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T17:29:21.697Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T17:38:13.559Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T17:44:15.163Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T17:50:16.675Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T18:02:37.575Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T18:10:23.399Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T18:18:24.113Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T18:24:25.077Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T18:36:26.104Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T18:44:27.162Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T18:51:28.116Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T18:58:30.031Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T19:10:40.937Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T19:17:31.871Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T19:23:33.394Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T19:32:35.365Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T19:45:37.858Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T19:52:38.616Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T19:57:39.731Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-01_08_07_46-6557315474715878263 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T20:00:44.697Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-04-01_08_07_46-6557315474715878263.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T20:00:44.721Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T20:00:44.769Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T20:00:44.789Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T20:00:44.807Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-01T20:00:44.828Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-01_08_07_46-6557315474715878263?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 16s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/74psbltc4olzo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #950

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/950/display/redirect>

Changes:


------------------------------------------
[...truncated 26.59 KB...]
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0
  Using cached cryptography-40.0.1-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.70.1-py3-none-any.whl (403 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.103
  Using cached botocore-1.29.103-py3-none-any.whl (10.6 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3051428 sha256=272aeabebea792c9178828a86635185215e74ccec0ef77921a6990596017a056
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.103 botocore-1.29.103 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.29.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.9 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0331150204.1680275265.097987/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0331150204.1680275265.097987/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0331150204.1680275265.097987/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0331150204.1680275265.097987/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230331150745098996-3954'
 createTime: '2023-03-31T15:07:46.234036Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-31_08_07_45-1690121513167711918'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0331150204'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-31T15:07:46.234036Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-31_08_07_45-1690121513167711918]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-31_08_07_45-1690121513167711918
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-31_08_07_45-1690121513167711918?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-31_08_07_45-1690121513167711918 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:58.045Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.198Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.228Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.293Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.368Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.392Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.455Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.496Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.538Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.572Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.593Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.627Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.657Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.690Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.714Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.745Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.765Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.786Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.806Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.828Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.850Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.954Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:07:59.989Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:08:00.016Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:08:00.045Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:08:00.068Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
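The step names in the fusion messages above outline the load test's graph: a synthetic read, a timing DoFn, a global Top combine (whose KeyWithVoid/CombinePerKey/UnKey expansion is what gets fused), a consumer, and a closing timing DoFn. A minimal runnable sketch of that shape, not the actual combine_test.py: beam.Create stands in for the SyntheticSource the real test reads from, and MeasureTime below is a simplified stand-in for the load-test metrics DoFn:

    import time
    import apache_beam as beam

    class MeasureTime(beam.DoFn):
        # Simplified stand-in: the real DoFn reports timestamps to a metrics namespace.
        def process(self, element):
            _ = time.time()
            yield element

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(1000))
         | 'Measure time: Start' >> beam.ParDo(MeasureTime())
         | 'Combine with Top 0' >> beam.CombineGlobally(
             beam.combiners.TopCombineFn(10))  # expands to KeyWithVoid/CombinePerKey/UnKey
         | 'Consume 0' >> beam.Map(len)
         | 'Measure time: End 0' >> beam.ParDo(MeasureTime()))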
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:08:01.157Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:08:01.180Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:08:01.217Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-31_08_07_45-1690121513167711918 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:08:28.945Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
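A minimal sketch of the cleanup the warning above suggests, using the google-cloud-monitoring client; the project id comes from the log, while the filter and the judgment of which descriptors are unused are assumptions:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    # List only custom descriptors, matching the metric prefix in the warning.
    descriptors = client.list_metric_descriptors(request={
        'name': project_name,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        # Deleting a descriptor also drops its history; confirm it is unused first.
        client.delete_metric_descriptor(name=descriptor.name)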
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:08:43.569Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:09:09.238Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:09:17.342Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:42:28.197Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:43:29.440Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T15:54:30.934Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T16:09:28.874Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T16:15:33.560Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T16:26:30.675Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T16:29:31.596Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T16:44:32.876Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T16:51:33.794Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T17:02:34.872Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T17:09:36.152Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T17:20:37.778Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T17:28:40.005Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T17:44:41.400Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T17:47:42.196Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T17:57:43.005Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T18:04:43.850Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T18:17:45.226Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T18:26:56.148Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T18:32:47.057Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T18:41:48.632Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T18:52:59.780Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T19:03:51.179Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T19:08:51.946Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T19:19:13.212Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T19:27:54.146Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T19:38:55.255Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T19:46:56.006Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T19:55:57.147Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T20:00:41.331Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-31_08_07_45-1690121513167711918.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T20:00:41.353Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T20:00:41.433Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T20:00:41.453Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T20:00:41.485Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-31T20:00:41.507Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-31_08_07_45-1690121513167711918 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-31_08_07_45-1690121513167711918?project=<ProjectId>
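The assertion above fires in load_test.py after wait_until_finish(duration=...) returns with the streaming job still active. A minimal sketch of that pattern, simplified from the harness and with a hypothetical timeout value:

    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    pipeline = beam.Pipeline()  # the real test builds this with DataflowRunner options
    _ = pipeline | beam.Create([1, 2, 3])
    result = pipeline.run()
    # duration is in milliseconds; the call returns once it elapses, even if
    # the streaming job has not reached a terminal state.
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)  # hypothetical 4h
    if not PipelineState.is_terminal(state):
        result.cancel()  # issues the cancel request seen in the log above
        raise AssertionError('Job did not reach a terminal state: %s' % state)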

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 19s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ukn2yywdxuwyc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #949

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/949/display/redirect>

Changes:


------------------------------------------
[...truncated 26.70 KB...]
  Using cached azure_core-1.26.3-py3-none-any.whl (174 kB)
Collecting azure-identity>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.102
  Using cached botocore-1.29.102-py3-none-any.whl (10.6 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3049766 sha256=ce8ec1dc871279ed0c22539b27183e6e790bbb0385d038a7904182ba0a642e32
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.102 botocore-1.29.102 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.29.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.9 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
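A hedged sketch of enabling the pre-building workflow that the message above recommends; flag names follow the linked documentation, the registry URL is a placeholder, and support depends on the SDK version:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--prebuild_sdk_container_engine=cloud_build',  # or 'local_docker'
        # Placeholder registry: the prebuilt image is pushed once and reused,
        # avoiding per-worker dependency installation at startup.
        '--docker_registry_push_url=gcr.io/<your-project>/beam-prebuilt',
    ])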
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230327" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0330150248.1680188895.894047/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0330150248.1680188895.894047/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0330150248.1680188895.894047/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0330150248.1680188895.894047/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230330150815895259-6165'
 createTime: '2023-03-30T15:08:17.011170Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-30_08_08_16-14519381636284639095'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0330150248'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-30T15:08:17.011170Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-30_08_08_16-14519381636284639095]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-30_08_08_16-14519381636284639095
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-30_08_08_16-14519381636284639095?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-30_08_08_16-14519381636284639095 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:24.543Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:25.911Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:25.940Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.019Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.089Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.131Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.196Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.261Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.304Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.335Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.359Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.392Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.425Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.457Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.493Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.528Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.561Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.586Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.607Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.635Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.694Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.788Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.817Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.847Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.868Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:26.890Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-30_08_08_16-14519381636284639095 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:27.971Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:27.994Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:08:28.025Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:09:00.818Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:09:20.921Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:09:53.450Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:10:04.070Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:52:37.627Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:55:34.677Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T15:56:36.203Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T16:25:37.510Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T16:27:38.812Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T17:00:51.368Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T17:06:53.196Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T17:32:54.533Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T17:34:45.865Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T17:43:47.407Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T17:45:48.225Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T18:06:49.465Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T18:15:50.383Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T18:19:51.474Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T18:30:53.004Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T18:43:54.815Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T18:51:55.835Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T19:01:56.684Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T19:09:07.520Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T19:19:58.518Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T19:28:03.278Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T19:38:01.970Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T19:46:03.642Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T19:57:05.020Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T20:04:07.221Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T20:15:09.202Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T20:22:10.017Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T20:40:11.086Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T20:41:11.783Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T20:59:22.804Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T21:05:13.660Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-30_08_08_16-14519381636284639095 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T21:08:59.109Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-30_08_08_16-14519381636284639095.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T21:08:59.147Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T21:08:59.202Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T21:08:59.220Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T21:08:59.257Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-30T21:08:59.282Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-30_08_08_16-14519381636284639095?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6h 3m 29s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rlap4q53qsc7m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #948

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/948/display/redirect>

Changes:


------------------------------------------
[...truncated 26.26 KB...]
Collecting sqlalchemy<2.0,>=1.3
  Using cached SQLAlchemy-1.4.47-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5
  Using cached psycopg2_binary-2.9.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0
  Using cached cryptography-40.0.1-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.70.1-py3-none-any.whl (403 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.101
  Using cached botocore-1.29.101-py3-none-any.whl (10.6 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3049665 sha256=0203a6f5b0e5768fd03fd63fda48e3a31e7061bccaae27656a0b1870e751f2c5
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.101 botocore-1.29.101 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.9.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.29.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.9 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0329125348.1680104383.085788/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0329125348.1680104383.085788/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0329125348.1680104383.085788/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0329125348.1680104383.085788/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230329153943086835-7545'
 createTime: '2023-03-29T15:39:44.288293Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-29_08_39_43-15443986007407330716'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0329125348'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-29T15:39:44.288293Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-29_08_39_43-15443986007407330716]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-29_08_39_43-15443986007407330716
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-29_08_39_43-15443986007407330716?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-29_08_39_43-15443986007407330716 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:51.147Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:57.595Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:57.927Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:57.995Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.063Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.091Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.160Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.208Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.246Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.275Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.300Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.327Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.362Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.397Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.430Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.462Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.495Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.529Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.555Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.577Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.599Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.688Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.714Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.733Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.757Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:58.784Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
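
The optimizer messages above describe the shape of the Combine load-test pipeline: a synthetic bounded read feeding a global Top combine, with the CombinePerKey expanded into ConvertToAccumulators / GroupByKey / Combine / Extract so partial combining is lifted ahead of the streaming shuffle. A minimal runnable sketch of the same shape (the synthetic source and the measurement DoFn are stand-ins, not the actual load-test code):

    import apache_beam as beam
    from apache_beam.transforms import combiners

    class MeasureTime(beam.DoFn):
        # Stand-in for the "Measure time" steps, which record runtime metrics
        # in the real load test; here it only passes elements through.
        def process(self, element):
            yield element

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(1000))  # stand-in for SyntheticSource
         | 'Measure time: Start' >> beam.ParDo(MeasureTime())
         # CombineGlobally expands to KeyWithVoid / CombinePerKey / UnKey,
         # matching the fused step names in the log.
         | 'Combine with Top 0' >> beam.CombineGlobally(
               combiners.TopCombineFn(n=20)).without_defaults()
         | 'Measure time: End 0' >> beam.Map(print))
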
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-29_08_39_43-15443986007407330716 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:59.840Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:59.857Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:39:59.896Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:40:00.416Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
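
The metric-descriptor warning above comes with a concrete remedy: list the project's custom.googleapis.com descriptors and delete the unused ones. A sketch with the google-cloud-monitoring client (assumed to be installed separately; the project id is a placeholder):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    descriptors = client.list_metric_descriptors(request={
        'name': 'projects/my-project',  # placeholder
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for d in descriptors:
        print(d.type)
        # client.delete_metric_descriptor(request={'name': d.name})  # frees a slot
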
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:40:38.737Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:41:10.504Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T15:41:21.548Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T16:23:26.878Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T16:25:24.034Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T16:27:34.953Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T16:55:26.093Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T16:57:27.131Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T17:09:28.179Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T17:21:29.189Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T17:27:40.045Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T17:38:31.382Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T17:45:32.141Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T17:56:44.433Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T18:04:06.540Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T18:14:37.373Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T18:22:38.408Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T18:33:49.282Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T18:41:40.525Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T18:51:51.422Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T18:59:42.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T19:11:03.450Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T19:18:55.115Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T19:27:56.286Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T19:37:47.468Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T19:46:48.326Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T19:56:50.256Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T20:00:39.166Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-29_08_39_43-15443986007407330716.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T20:00:39.200Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T20:00:39.236Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T20:00:39.256Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T20:00:39.276Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-29T20:00:39.302Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-29_08_39_43-15443986007407330716 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-29_08_39_43-15443986007407330716?project=<ProjectId>
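
The assertion comes from the harness's bounded wait: wait_until_finish(duration=...) polls the Dataflow job for the configured number of milliseconds, the streaming job was still running when the timeout expired, so the runner issued the cancel request seen above and the build failed. A sketch of that guard pattern (not the exact harness code; the timeout value is illustrative):

    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)  # ms, illustrative
    if not PipelineState.is_terminal(state):
        result.cancel()  # streaming jobs never finish on their own
        raise AssertionError('Job did not reach a terminal state: %s' % state)
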

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 23m 35s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/nixbakbsm7tio

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #947

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/947/display/redirect>

Changes:


------------------------------------------
[...truncated 25.16 KB...]
Collecting mock<6.0.0,>=1.0.1
  Using cached mock-5.0.1-py3-none-any.whl (30 kB)
Collecting parameterized<0.9.0,>=0.7.1
  Using cached parameterized-0.8.1-py2.py3-none-any.whl (26 kB)
Collecting pyhamcrest!=1.10.0,<2.0.0,>=1.9
  Using cached PyHamcrest-1.10.1-py3-none-any.whl (48 kB)
Collecting pyyaml<7.0.0,>=3.12
  Using cached PyYAML-6.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (596 kB)
Collecting requests_mock<2.0,>=1.7
  Using cached requests_mock-1.10.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting pytest<8.0,>=7.1.2
  Using cached pytest-7.2.2-py3-none-any.whl (317 kB)
Collecting pytest-xdist<4,>=2.5.0
  Using cached pytest_xdist-3.2.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3
  Using cached SQLAlchemy-1.4.47-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5
  Using cached psycopg2_binary-2.9.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0
  Using cached cryptography-40.0.1-cp36-abi3-manylinux_2_28_x86_64.whl (3.7 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.70.1-py3-none-any.whl (403 kB)
Collecting boto3>=1.9
  Using cached boto3-1.26.100-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.100
  Using cached botocore-1.29.100-py3-none-any.whl (10.6 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.53.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Requirement already satisfied: pluggy<2.0,>=0.12 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in /home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3049665 sha256=8b872c97102fe47adc5a1ec7d7aeffb081749b88254fd6e28763cacad37d2a5c
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.100 botocore-1.29.100 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.8.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.29.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.53.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.8 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.2 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "/home/jenkins/jenkins-slave/workspace/beam_LoadTests_Python_Combine_Dataflow_Streaming/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0328150215.1680016068.204805/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0328150215.1680016068.204805/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0328150215.1680016068.204805/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0328150215.1680016068.204805/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230328150748205764-6170'
 createTime: '2023-03-28T15:07:49.412387Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-28_08_07_48-5951913282682976877'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0328150215'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-28T15:07:49.412387Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-28_08_07_48-5951913282682976877]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-28_08_07_48-5951913282682976877
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-28_08_07_48-5951913282682976877?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-28_08_07_48-5951913282682976877 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:05.101Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.426Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.460Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.535Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.619Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.646Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.704Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.771Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.814Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.850Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.882Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.916Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.938Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:06.969Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.004Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.038Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.060Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.084Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.115Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.158Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.193Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.276Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.317Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.347Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.381Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:07.405Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:08.467Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:08.499Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:08.545Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-28_08_07_48-5951913282682976877 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:38.796Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:08:53.749Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:09:24.345Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:09:34.478Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:44:17.163Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:52:13.282Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T15:53:10.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T16:09:15.486Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T16:19:12.586Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T16:24:13.631Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T16:35:16.483Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T16:47:17.312Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T16:54:28.499Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T17:08:33.408Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T17:24:20.563Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T17:27:22.505Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T17:38:34.463Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T17:43:25.990Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T17:59:27.332Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T18:06:28.330Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T18:14:29.337Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T18:20:30.195Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T18:33:31.234Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-28T18:35:32.290Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-15' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:215)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
	at com.sun.proxy.$Proxy142.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1215)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1207)
	at hudson.Launcher$ProcStarter.join(Launcher.java:524)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
	at hudson.model.Build$BuildExecution.build(Build.java:199)
	at hudson.model.Build$BuildExecution.doRun(Build.java:164)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
	at hudson.model.Run.execute(Run.java:1896)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
	at hudson.model.ResourceController.execute(ResourceController.java:101)
	at hudson.model.Executor.run(Executor.java:442)
Caused by: hudson.remoting.Channel$OrderlyShutdown: Command Close created at
	at hudson.remoting.Channel$CloseCommand.execute(Channel.java:1313)
	at hudson.remoting.Channel$1.handle(Channel.java:606)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:81)
Caused by: Command Close created at
	at hudson.remoting.Command.<init>(Command.java:70)
	at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:1306)
	at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:1304)
	at hudson.remoting.Channel.close(Channel.java:1480)
	at hudson.remoting.Channel.close(Channel.java:1447)
	at hudson.remoting.Channel$CloseCommand.execute(Channel.java:1312)
	... 2 more
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-15 is offline; cannot locate jdk_1.8_latest

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #946

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/946/display/redirect?page=changes>

Changes:

[noreply] Bump webpack (#25846)


------------------------------------------
[...truncated 25.77 KB...]
  Using cached google_cloud_bigquery_storage-2.19.0-py2.py3-none-any.whl (190 kB)
Collecting google-cloud-core<3,>=2.0.0
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<3,>=2.0.0
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0
  Using cached google_cloud_spanner-3.29.0-py2.py3-none-any.whl (327 kB)
Collecting google-cloud-dlp<4,>=3.0.0
  Using cached google_cloud_dlp-3.12.0-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0
  Using cached google_cloud_language-2.9.0-py2.py3-none-any.whl (99 kB)
Collecting google-cloud-videointelligence<3,>=2.0
  Using cached google_cloud_videointelligence-2.11.1-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2
  Using cached google_cloud_vision-3.4.0-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.10.2-py2.py3-none-any.whl (173 kB)
Collecting boto3>=1.9
  Using cached boto3-1.26.99-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.99
  Using cached botocore-1.29.99-py3-none-any.whl (10.5 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.3-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3043795 sha256=e5fb0be4c66e93f34d755de067c6e62d2c022b73763ec35db6e0474fabc2a9fe
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.99 botocore-1.29.99 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.8.0 google-cloud-bigquery-storage-2.19.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.0 google-cloud-language-2.9.0 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.2 google-cloud-spanner-3.29.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.0 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.51.3 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.8 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.2 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0327142418.1679929676.011043/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0327142418.1679929676.011043/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0327142418.1679929676.011043/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0327142418.1679929676.011043/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230327150756012190-4226'
 createTime: '2023-03-27T15:07:57.127870Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-27_08_07_56-371986831622660924'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0327142418'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-27T15:07:57.127870Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-27_08_07_56-371986831622660924]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-27_08_07_56-371986831622660924
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-27_08_07_56-371986831622660924?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-27_08_07_56-371986831622660924 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:07.456Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:08.965Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:08.998Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.061Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.122Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.150Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.214Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.284Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.322Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.359Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.394Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.416Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.447Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.469Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.493Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.527Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.559Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.591Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.627Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.651Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.674Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
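
The "Combine with Top 0/KeyWithVoid -> CombinePerKey -> UnKey" stages above are the standard expansion of a global combine: CombineGlobally keys every element with a void key, combines per key, then strips the key. A minimal sketch of the pipeline shape these fusion messages imply (a stand-in for combine_test.py, whose source is not shown here; the source and element count are placeholders):

    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read synthetic' >> beam.Create(range(1000))  # placeholder source
            | 'Measure time: Start' >> beam.Map(lambda x: x)
            # Expands to KeyWithVoid -> CombinePerKey -> UnKey, matching the log.
            | 'Combine with Top 0' >> beam.CombineGlobally(combiners.TopCombineFn(10))
            | 'Consume 0' >> beam.Map(len)
            | 'Measure time: End 0' >> beam.Map(lambda x: x))
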
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.776Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.818Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.846Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.873Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:09.907Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:11.004Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:11.031Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:11.080Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-27_08_07_56-371986831622660924 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:08:40.836Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
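
Per the message above, Dataflow stops creating new custom.googleapis.com/* descriptors once a project already has 100 of them; user counters remain available under dataflow.googleapis.com/job/user_counter. A minimal cleanup sketch, assuming the google-cloud-monitoring package (the filter is illustrative; delete only descriptors you have verified are unused):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    for descriptor in client.list_metric_descriptors(
            name='projects/apache-beam-testing'):
        if descriptor.type.startswith('custom.googleapis.com/'):
            print(descriptor.name)
            # client.delete_metric_descriptor(name=descriptor.name)
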
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:09:04.708Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:09:45.418Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:09:57.962Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T15:45:11.240Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T16:29:18.902Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T16:33:29.147Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T17:01:20.859Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T17:11:22.015Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T17:34:33.133Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T17:45:11.241Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T17:48:24.034Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T18:30:25.746Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T18:33:25.985Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T18:56:29.296Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T19:10:40.209Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T19:32:31.322Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T19:48:44.039Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T20:00:16.591Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T20:00:33.911Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-27_08_07_56-371986831622660924.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T20:00:33.940Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T20:00:33.995Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T20:00:34.014Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T20:00:34.041Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-27T20:00:34.062Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-27_08_07_56-371986831622660924 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-27_08_07_56-371986831622660924?project=<ProjectId>
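
What failed here: the load test called wait_until_finish with a timeout, the streaming job was still being cancelled when the wait returned, and the harness raised because the state was not terminal. A simplified sketch of that pattern (not the verbatim load_test.py source; `pipeline` and `timeout_ms` stand in for the harness's own objects):

    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()
    # Returns after timeout_ms even if the streaming job is still running.
    state = result.wait_until_finish(duration=timeout_ms)
    if not PipelineState.is_terminal(state or result.state):
        result.cancel()  # request cancellation from the service
        raise AssertionError('Job did not reach a terminal state')
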

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 54m 58s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lxfgwzydom7by

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #945

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/945/display/redirect>

Changes:


------------------------------------------
[...truncated 26.47 KB...]
  Using cached google_cloud_dlp-3.12.0-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0
  Using cached google_cloud_language-2.9.0-py2.py3-none-any.whl (99 kB)
Collecting google-cloud-videointelligence<3,>=2.0
  Using cached google_cloud_videointelligence-2.11.0-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2
  Using cached google_cloud_vision-3.4.0-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.10.2-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.99
  Using cached botocore-1.29.99-py3-none-any.whl (10.5 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.3-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3043795 sha256=0e74424e997958b0952c8cdcf141c39fc30aa9ce0476bd926a9f85e98abb7363
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.99 botocore-1.29.99 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.7.0 google-cloud-bigquery-storage-2.19.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.0 google-cloud-dlp-3.12.0 google-cloud-language-2.9.0 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.2 google-cloud-spanner-3.29.0 google-cloud-videointelligence-2.11.0 google-cloud-vision-3.4.0 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.51.3 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.8 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.2 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0326150156.1679843283.932706/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0326150156.1679843283.932706/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0326150156.1679843283.932706/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0326150156.1679843283.932706/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230326150803933855-8892'
 createTime: '2023-03-26T15:08:05.061421Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-26_08_08_04-9433610547983021201'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0326150156'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-26T15:08:05.061421Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-26_08_08_04-9433610547983021201]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-26_08_08_04-9433610547983021201
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-26_08_08_04-9433610547983021201?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-26_08_08_04-9433610547983021201 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:17.082Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:23.397Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:28.434Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.501Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.566Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.594Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.659Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.717Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.750Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.779Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.802Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.831Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.884Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.906Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.933Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.965Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:33.998Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:34.033Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:34.062Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:34.091Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:34.226Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:34.256Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:34.285Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:34.318Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:34.342Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:35.405Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:35.431Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:35.477Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-26_08_08_04-9433610547983021201 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:08:55.761Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:09:13.510Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:09:51.949Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:10:02.594Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:52:12.419Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:54:00.047Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T15:55:01.242Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T16:24:02.718Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T16:26:03.145Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T16:57:04.613Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T17:08:15.533Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T17:30:07.657Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T17:31:09.024Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T17:42:09.483Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T17:45:10.157Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T18:04:11.780Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T18:12:12.920Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T18:18:14.478Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T18:26:25.252Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T18:37:16.234Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T18:48:17.318Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T18:55:18.457Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T19:02:19.348Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T19:13:20.678Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T19:14:22.462Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T19:29:23.372Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T19:36:24.127Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T19:47:25.577Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T19:56:27.174Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-26_08_08_04-9433610547983021201 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T20:00:40.150Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-26_08_08_04-9433610547983021201.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T20:00:40.187Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T20:00:40.245Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T20:00:40.269Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T20:00:40.296Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-26T20:00:40.313Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-26_08_08_04-9433610547983021201?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 12s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/slw3hkqoieq6o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #944

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/944/display/redirect>

Changes:


------------------------------------------
[...truncated 26.66 KB...]
  Using cached google_cloud_vision-3.4.0-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.10.2-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.99
  Using cached botocore-1.29.99-py3-none-any.whl (10.5 MB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.3-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3043795 sha256=bff411e226704c5471e77e30734f036a3ccfffad6c6700b1435f88f2c92f89a8
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.99 botocore-1.29.99 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.7.0 google-cloud-bigquery-storage-2.19.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.0 google-cloud-dlp-3.12.0 google-cloud-language-2.9.0 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.2 google-cloud-spanner-3.29.0 google-cloud-videointelligence-2.11.0 google-cloud-vision-3.4.0 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.51.3 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.8 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.2 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0325150157.1679756862.074695/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0325150157.1679756862.074695/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0325150157.1679756862.074695/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0325150157.1679756862.074695/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230325150742075647-1216'
 createTime: '2023-03-25T15:07:43.125346Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-25_08_07_42-9127055728805385722'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0325150157'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-25T15:07:43.125346Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-25_08_07_42-9127055728805385722]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-25_08_07_42-9127055728805385722
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-25_08_07_42-9127055728805385722?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-25_08_07_42-9127055728805385722 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:48.381Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.040Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.070Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.137Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.230Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.259Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.326Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.393Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.425Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.457Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.488Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.523Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.549Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.578Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.611Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.638Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.660Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.683Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.714Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.747Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.772Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
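
The "Fusing consumer ... into ..." lines above describe Dataflow collapsing adjacent steps of the Combine load test into single fused stages. As a minimal sketch of the pipeline shape being optimized (the synthetic source spec and Top size below are illustrative placeholders, not the job's actual --input_options), note how CombineGlobally expands into the KeyWithVoid, CombinePerKey and UnKey sub-steps named in the log:

    import apache_beam as beam
    from apache_beam.testing.synthetic_pipeline import SyntheticSource

    # Illustrative source spec; the real test derives it from --input_options.
    source_spec = {'numRecords': 1000, 'keySizeBytes': 10, 'valueSizeBytes': 90}

    with beam.Pipeline() as p:
        _ = (p
             | 'Read synthetic' >> beam.io.Read(SyntheticSource(source_spec))
             # CombineGlobally expands into KeyWithVoid -> CombinePerKey -> UnKey,
             # the sub-steps that the fusion messages above refer to.
             | 'Combine with Top 0' >> beam.CombineGlobally(
                 beam.combiners.TopCombineFn(1)).without_defaults())
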
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.869Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.918Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.939Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.965Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:50.995Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:52.066Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:52.090Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:07:52.126Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-25_08_07_42-9127055728805385722 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:08:24.379Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
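
As a hedged illustration of the cleanup suggested in that message, unused custom metric descriptors can be listed (and then deleted) with the google-cloud-monitoring client; the project name and filter below are assumptions, and the linked APIs Explorer pages expose the same List/Delete calls over REST:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = monitoring_v3.ListMetricDescriptorsRequest(
        name='projects/apache-beam-testing',  # assumed project
        filter='metric.type = starts_with("custom.googleapis.com/")',
    )
    for descriptor in client.list_metric_descriptors(request=request):
        # Candidates for delete_metric_descriptor() if no longer relied upon.
        print(descriptor.type)
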
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:08:31.285Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:09:05.682Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:09:13.966Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:42:19.095Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:53:16.983Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T15:54:18.283Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T16:14:19.145Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T16:18:20.841Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T16:25:32.352Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T16:27:22.696Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T16:46:23.562Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T16:53:24.297Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T16:58:25.738Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T17:05:26.659Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T17:19:27.648Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T17:25:28.266Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T17:33:40.053Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T17:41:30.866Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T17:50:32.286Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T17:51:34.510Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T18:07:36.044Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T18:15:37.018Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T18:16:40.564Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T18:29:41.521Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T18:40:42.408Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T18:48:44.405Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T18:55:45.955Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T19:02:47.155Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T19:15:47.968Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T19:21:48.786Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T19:28:59.683Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T19:37:51.514Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T19:39:52.746Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T19:54:53.772Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-25_08_07_42-9127055728805385722 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T20:00:58.472Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-25_08_07_42-9127055728805385722.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T20:00:58.506Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T20:00:58.552Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T20:00:58.579Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T20:00:58.606Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-25T20:00:58.623Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-25_08_07_42-9127055728805385722?project=<ProjectId>
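
The assertion comes from load_test.py calling wait_until_finish with the test timeout: for a streaming job the call returns the job's current, non-terminal state once the timeout expires, and the test then fails. A minimal sketch of that pattern, with an explicit cancel as one way to force a terminal state (the pipeline and timeout below are placeholders, not the load test's real configuration):

    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    TIMEOUT_MS = 4 * 60 * 60 * 1000  # placeholder for the test's --timeout_ms

    p = beam.Pipeline()
    _ = p | beam.Create([1, 2, 3]) | beam.CombineGlobally(max)
    result = p.run()
    # For a streaming job this returns the current state when the timeout expires.
    state = result.wait_until_finish(duration=TIMEOUT_MS)
    if state not in (PipelineState.DONE, PipelineState.FAILED, PipelineState.CANCELLED):
        result.cancel()  # reach a terminal state instead of raising
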

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/shrpthgo5hugi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #943

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/943/display/redirect>

Changes:


------------------------------------------
[...truncated 26.73 KB...]
  Using cached boto3-1.26.98-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.98
  Using cached botocore-1.29.98-py3-none-any.whl (10.5 MB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.3-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3043503 sha256=334f6a5f7165734118e0b84cfa94d32ccaf7955bb2c7b7f114dc885666f22bdd
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.98 botocore-1.29.98 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.0 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.7.0 google-cloud-bigquery-storage-2.19.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.0 google-cloud-dlp-3.12.0 google-cloud-language-2.9.0 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.2 google-cloud-spanner-3.29.0 google-cloud-videointelligence-2.11.0 google-cloud-vision-3.4.0 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.51.3 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.8 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
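
As a rough sketch of opting into the pre-building workflow that message recommends (flag names follow the linked Dataflow guide; the registry URL is a placeholder, and flag availability should be checked against the SDK version in use):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--prebuild_sdk_container_engine=cloud_build',
        # Placeholder Artifact Registry path for the prebuilt image.
        '--docker_registry_push_url=us-central1-docker.pkg.dev/my-project/my-repo',
    ])
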
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230322" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0324150158.1679670472.502075/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0324150158.1679670472.502075/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0324150158.1679670472.502075/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0324150158.1679670472.502075/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230324150752503140-7237'
 createTime: '2023-03-24T15:07:53.641614Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-24_08_07_53-13958841709548414488'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0324150158'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-24T15:07:53.641614Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-24_08_07_53-13958841709548414488]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-24_08_07_53-13958841709548414488
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-24_08_07_53-13958841709548414488?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-24_08_07_53-13958841709548414488 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:05.191Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:07.533Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:07.563Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:07.628Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:07.704Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:07.731Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:07.788Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:07.856Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:07.904Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:07.937Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:07.969Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.037Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.064Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.100Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.134Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.167Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.204Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.230Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.253Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.291Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.403Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.446Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.480Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.515Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:08.539Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-24_08_07_53-13958841709548414488 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:09.622Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:09.661Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:09.706Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:21.870Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:53.006Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:08:53.037Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:09:02.719Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
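
The "could be a quota issue" hint above can be checked directly; a small sketch using the Compute Engine API client (project, region, and metric names are assumptions) reads the regional quota usage that caps worker counts:

    from googleapiclient import discovery

    compute = discovery.build('compute', 'v1')
    region = compute.regions().get(
        project='apache-beam-testing', region='us-central1').execute()
    for quota in region.get('quotas', []):
        if quota['metric'] in ('CPUS', 'IN_USE_ADDRESSES'):
            print(quota['metric'], quota['usage'], '/', quota['limit'])
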
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:09:29.507Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:09:37.674Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:42:16.443Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:53:07.537Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T15:54:18.566Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T16:15:09.922Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T16:25:14.339Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T16:27:11.496Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T16:36:23.759Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T16:40:14.177Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T16:57:14.158Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T17:05:16.053Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T17:09:16.877Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T17:18:25.792Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T17:24:31.622Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T17:37:32.458Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T17:44:22.398Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T17:52:25.066Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T18:02:26.697Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T18:12:27.791Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T18:18:29.098Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T18:24:30.378Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T18:28:32.124Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T18:46:33.903Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T18:54:54.759Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T18:57:35.846Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T19:06:38Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T19:18:39.768Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T19:29:40.468Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T19:34:41.894Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T19:40:43.027Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T19:52:44.496Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T19:55:46.360Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T20:00:42.631Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-24_08_07_53-13958841709548414488.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T20:00:42.669Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-24_08_07_53-13958841709548414488 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T20:00:42.726Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T20:00:42.749Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T20:00:42.779Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-24T20:00:42.803Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-24_08_07_53-13958841709548414488?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/tvn74b2kdmueq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #942

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/942/display/redirect>

Changes:


------------------------------------------
[...truncated 26.45 KB...]
Collecting google-cloud-vision<4,>=2
  Using cached google_cloud_vision-3.4.0-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.10.2-py2.py3-none-any.whl (173 kB)
Collecting azure-storage-blob>=12.3.2
  Using cached azure_storage_blob-12.15.0-py3-none-any.whl (387 kB)
Collecting azure-core>=1.7.0
  Using cached azure_core-1.26.3-py3-none-any.whl (174 kB)
Collecting azure-identity>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Collecting boto3>=1.9
  Using cached boto3-1.26.97-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.97
  Using cached botocore-1.29.97-py3-none-any.whl (10.5 MB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.3-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3042809 sha256=da626236f519dc00d2f994b2b42cb3bc604539e43983bbd69d014f37fb88d72c
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.97 botocore-1.29.97 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.2 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.7.0 google-cloud-bigquery-storage-2.19.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.0 google-cloud-dlp-3.12.0 google-cloud-language-2.9.0 google-cloud-pubsub-2.15.2 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.2 google-cloud-spanner-3.28.0 google-cloud-videointelligence-2.11.0 google-cloud-vision-3.4.0 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.51.3 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.8 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0323150211.1679584210.737873/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0323150211.1679584210.737873/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0323150211.1679584210.737873/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0323150211.1679584210.737873/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230323151010739705-6267'
 createTime: '2023-03-23T15:10:12.230865Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-23_08_10_11-10884996483079570072'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0323150211'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-23T15:10:12.230865Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-23_08_10_11-10884996483079570072]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-23_08_10_11-10884996483079570072
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-23_08_10_11-10884996483079570072?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-23_08_10_11-10884996483079570072 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:19.447Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.036Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.076Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.146Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.195Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.222Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.281Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.346Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.390Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.446Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.477Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.508Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.543Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.575Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.608Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.640Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.660Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.683Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.716Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.751Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.777Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.886Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.922Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.952Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:21.983Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:22.017Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
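
The fusion messages above outline the shape of the load-test pipeline: a synthetic read feeding a Top combiner between two timing probes. A rough reconstruction from the step labels (the source, combiner arity, and element count are illustrative guesses, not the test's actual configuration):

    # Rough reconstruction of the pipeline implied by the fusion log;
    # beam.Create stands in for the SyntheticSource the load test uses.
    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(1000))
         | 'Measure time: Start' >> beam.Map(lambda x: x)
         | 'Combine with Top 0' >> beam.CombineGlobally(combiners.TopCombineFn(10))
         | 'Consume 0' >> beam.FlatMap(lambda tops: tops)
         | 'Measure time: End 0' >> beam.Map(lambda x: x))
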
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-23_08_10_11-10884996483079570072 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:23.102Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:23.137Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:23.165Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:10:53.289Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
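
The 100-descriptor warning above is benign for the test itself, but the stale descriptors it mentions can be pruned; a sketch with the google-cloud-monitoring client (the filter is an assumption about how the Dataflow-created metrics are named, and deletion is irreversible, so it is left commented out):

    # Sketch: list (and optionally prune) Dataflow-created custom metric
    # descriptors so new user metrics can be registered again.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print('would delete', descriptor.type)
        # client.delete_metric_descriptor(request={'name': descriptor.name})
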
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:11:03.197Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:11:44.913Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:11:55.838Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:53:50.513Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:55:47.887Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T15:56:48.976Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T16:24:50.363Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T16:27:51.500Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T16:58:52.219Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T17:10:03.219Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T17:33:54.451Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T17:44:55.424Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T17:58:56.988Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T18:08:58.691Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T18:17:59.578Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T18:33:00.727Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T18:34:03.364Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T18:52:05.111Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T19:06:06.845Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T19:16:07.859Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T19:17:38.998Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T19:28:20.907Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T19:39:12.847Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T19:48:14.364Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T19:57:16.510Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T20:01:03.936Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-23_08_10_11-10884996483079570072.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T20:01:03.967Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T20:01:04.037Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T20:01:04.079Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T20:01:04.101Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-23T20:01:04.122Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-23_08_10_11-10884996483079570072 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-23_08_10_11-10884996483079570072?project=<ProjectId>
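
The assertion above comes from the harness's wait-and-check pattern: wait_until_finish(duration=...) takes a timeout in milliseconds and returns whatever state the job is in when it lapses, and a streaming job still in CANCELLING is not terminal. A minimal sketch of that pattern (the trivial pipeline, timeout value, and cancel() fallback are illustrative, not the load test's actual code):

    # Sketch of the wait-and-assert pattern that raised the error above.
    import apache_beam as beam
    from apache_beam.runners.runner import PipelineState

    pipeline = beam.Pipeline()  # the load test builds this with DataflowRunner options
    _ = pipeline | beam.Create([1, 2, 3])
    result = pipeline.run()
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)  # duration is in ms
    if not PipelineState.is_terminal(state):
        result.cancel()  # streaming jobs must be cancelled explicitly
        raise AssertionError('Job did not reach a terminal state: %s' % state)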

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 54m 41s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/ok35v3tmtvqyc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #941

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/941/display/redirect>

Changes:


------------------------------------------
[...truncated 26.36 KB...]
Collecting google-cloud-videointelligence<3,>=2.0
  Using cached google_cloud_videointelligence-2.11.0-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2
  Using cached google_cloud_vision-3.4.0-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.10.2-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.96
  Using cached botocore-1.29.96-py3-none-any.whl (10.5 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.3-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3042809 sha256=c5487eba0c72514e0b2e853f0bb6c5e153a1cb8ac94c72a91591a50a906f27c5
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
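
The sha256 pip prints above is the digest of the freshly built wheel; recomputing it is a quick integrity check on the cached artifact (the path below is a placeholder for wherever the wheel landed):

    # Recompute the wheel digest pip reported above; path is a placeholder.
    import hashlib

    def sha256_of(path):
        digest = hashlib.sha256()
        with open(path, 'rb') as f:
            for chunk in iter(lambda: f.read(1 << 20), b''):
                digest.update(chunk)
        return digest.hexdigest()

    print(sha256_of('apache_beam-2.47.0.dev0-py3-none-any.whl'))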
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.96 botocore-1.29.96 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.2 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.7.0 google-cloud-bigquery-storage-2.19.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.0 google-cloud-dlp-3.12.0 google-cloud-language-2.9.0 google-cloud-pubsub-2.15.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.2 google-cloud-spanner-3.28.0 google-cloud-videointelligence-2.11.0 google-cloud-vision-3.4.0 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.51.3 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.8 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0322150202.1679497674.735114/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0322150202.1679497674.735114/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0322150202.1679497674.735114/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0322150202.1679497674.735114/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230322150754736164-9782'
 createTime: '2023-03-22T15:07:55.779091Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-22_08_07_55-3123863398541312888'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0322150202'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-22T15:07:55.779091Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-22_08_07_55-3123863398541312888]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-22_08_07_55-3123863398541312888
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-22_08_07_55-3123863398541312888?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-22_08_07_55-3123863398541312888 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:06.019Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:07.735Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:07.771Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:07.826Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:07.880Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:07.902Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:07.960Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.004Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.046Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.078Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.115Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.139Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.160Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.192Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.227Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.262Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.287Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.325Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.352Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.387Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.420Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.507Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.534Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.567Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.600Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:08.636Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:09.701Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:09.733Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:09.762Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-22_08_07_55-3123863398541312888 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:25.653Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:08:53.843Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:09:37.433Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:09:45.409Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:50:11.026Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:55:08.552Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T15:56:19.723Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T16:15:10.546Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T16:21:11.290Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T16:37:12.712Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T16:49:18.242Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T16:50:15.310Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T16:56:26.232Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T17:11:17.011Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T17:17:18.615Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T17:30:23.584Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T17:34:21.268Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T17:55:22.323Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T17:56:23.304Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T18:07:24.214Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T18:14:25.273Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T18:27:37.615Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T18:36:48.763Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T18:44:31.383Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T18:50:32.184Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T19:01:34.393Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T19:12:36.151Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T19:19:37.970Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T19:26:38.981Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T19:37:40.928Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T19:38:42.497Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T19:53:43.548Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-22_08_07_55-3123863398541312888 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T20:00:44.337Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-22_08_07_55-3123863398541312888.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T20:00:44.368Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T20:00:44.424Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T20:00:44.445Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T20:00:44.467Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-22T20:00:44.489Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-22_08_07_55-3123863398541312888?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 8s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jdsb3sujjqeh2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #940

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/940/display/redirect>

Changes:


------------------------------------------
[...truncated 26.39 KB...]
Collecting cryptography>=36.0.0
  Using cached cryptography-39.0.2-cp36-abi3-manylinux_2_28_x86_64.whl (4.2 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.70.0-py3-none-any.whl (403 kB)
Collecting azure-storage-blob>=12.3.2
  Using cached azure_storage_blob-12.15.0-py3-none-any.whl (387 kB)
Collecting azure-core>=1.7.0
  Using cached azure_core-1.26.3-py3-none-any.whl (174 kB)
Collecting azure-identity>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.95
  Using cached botocore-1.29.95-py3-none-any.whl (10.5 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.3-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3041044 sha256=321c0f11f642f471cd67e235c17a82de1322c50da51eeedc3bc9a7f5d8c078e3
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.95 botocore-1.29.95 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.2 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.7.0 google-cloud-bigquery-storage-2.19.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.0 google-cloud-dlp-3.12.0 google-cloud-language-2.9.0 google-cloud-pubsub-2.15.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.2 google-cloud-spanner-3.28.0 google-cloud-videointelligence-2.11.0 google-cloud-vision-3.4.0 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.51.3 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.8 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0321150157.1679411262.408407/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0321150157.1679411262.408407/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0321150157.1679411262.408407/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0321150157.1679411262.408407/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230321150742409441-8754'
 createTime: '2023-03-21T15:07:46.228744Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-21_08_07_43-9109527523852948691'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0321150157'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-21T15:07:46.228744Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-21_08_07_43-9109527523852948691]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-21_08_07_43-9109527523852948691
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-21_08_07_43-9109527523852948691?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-21_08_07_43-9109527523852948691 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:52.876Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.262Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.282Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.346Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.416Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.454Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.515Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.584Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.624Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.660Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.688Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.721Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.755Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.787Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.821Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.853Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.877Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.911Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.943Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.978Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:54.999Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
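
The fused step names above trace the whole pipeline: the KeyWithVoid/CombinePerKey/UnKey sub-steps are what CombineGlobally expands into. A minimal sketch of that shape, with identity placeholders where the real combine_test.py uses its synthetic source and metric-measuring DoFns:

    import apache_beam as beam

    # Sketch of the shape implied by the step labels in this log.
    # beam.Create stands in for the synthetic source; the 'Measure time'
    # and 'Consume' steps are identity placeholders, and TopCombineFn(20)
    # uses an assumed top-n size.
    with beam.Pipeline(options=options) as p:  # options: PipelineOptions as above
        (p
         | 'Read synthetic' >> beam.Create(range(1000))
         | 'Measure time: Start' >> beam.Map(lambda x: x)
         | 'Combine with Top 0' >> beam.CombineGlobally(
             beam.combiners.TopCombineFn(20)).without_defaults()
         | 'Consume 0' >> beam.Map(len)
         | 'Measure time: End 0' >> beam.Map(lambda x: x))
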
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:55.097Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:55.130Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:55.158Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:55.178Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:55.203Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:56.289Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:56.312Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:07:56.359Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-21_08_07_43-9109527523852948691 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:08:07.358Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
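
The 100-descriptor warning above can be cleared by pruning stale custom metric descriptors; a hedged sketch using the google-cloud-monitoring client (the filter is illustrative, and deletion is irreversible, so inspect before deleting):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = monitoring_v3.ListMetricDescriptorsRequest(
        name='projects/apache-beam-testing',
        filter='metric.type = starts_with("custom.googleapis.com/")',
    )
    for descriptor in client.list_metric_descriptors(request=request):
        # Sketch: delete only descriptors you know are unused.
        client.delete_metric_descriptor(name=descriptor.name)
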
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:08:47.285Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:09:27.617Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:09:38.764Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:52:04.469Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:54:01.699Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T15:56:13.707Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T16:25:14.886Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T16:26:05.821Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T16:58:07.158Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T16:59:07.666Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T17:06:09.391Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T17:09:10.562Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T17:33:11.663Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T17:40:12.879Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T18:07:13.997Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T18:09:15.215Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T18:14:26.688Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T18:17:17.629Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T18:40:19.195Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T18:53:20.570Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T18:54:22.488Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T18:56:25.276Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T19:13:26.615Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T19:26:30.392Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T19:38:28.661Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T19:47:30.460Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T19:49:31.476Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-21_08_07_43-9109527523852948691 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T20:00:54.318Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-21_08_07_43-9109527523852948691.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T20:00:54.363Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T20:00:54.427Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T20:00:54.456Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T20:00:54.478Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-21T20:00:54.508Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-21_08_07_43-9109527523852948691?project=<ProjectId>
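
The assertion above is the load-test harness's timeout check: wait_until_finish(duration=...) returns the job's current state once the timeout (in milliseconds) elapses, and a streaming job still in RUNNING or CANCELLING is not terminal. A minimal sketch of that pattern, with pipeline, timeout_ms, and console_url as illustrative names:

    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()
    # Returns after timeout_ms even if the streaming job never finishes.
    state = result.wait_until_finish(duration=timeout_ms)
    if not PipelineState.is_terminal(state):
        raise AssertionError(
            'Job did not reach a terminal state. Console URL: %s' % console_url)
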

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 18s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/yanessvmofisc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #939

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/939/display/redirect>

Changes:


------------------------------------------
[...truncated 26.17 KB...]
  Using cached SQLAlchemy-1.4.47-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5
  Using cached psycopg2_binary-2.9.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0
  Using cached cryptography-39.0.2-cp36-abi3-manylinux_2_28_x86_64.whl (4.2 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.70.0-py3-none-any.whl (403 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.94
  Using cached botocore-1.29.94-py3-none-any.whl (10.5 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.3-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3041044 sha256=ec3296fe6e75874a59325ca4fd7ad50c947088bf093995b080260e02908ffec4
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.94 botocore-1.29.94 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.2 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.7.0 google-cloud-bigquery-storage-2.19.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.0 google-cloud-dlp-3.12.0 google-cloud-language-2.9.0 google-cloud-pubsub-2.15.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.2 google-cloud-spanner-3.28.0 google-cloud-videointelligence-2.11.0 google-cloud-vision-3.4.0 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.51.3 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.7 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0320150200.1679324868.275141/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0320150200.1679324868.275141/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0320150200.1679324868.275141/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0320150200.1679324868.275141/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230320150748276176-3371'
 createTime: '2023-03-20T15:07:49.918077Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-20_08_07_48-17328042593445841494'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0320150200'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-20T15:07:49.918077Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-20_08_07_48-17328042593445841494]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-20_08_07_48-17328042593445841494
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-20_08_07_48-17328042593445841494?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-20_08_07_48-17328042593445841494 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:54.968Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:56.530Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:56.554Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:56.618Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:56.677Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:56.715Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:56.780Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:56.847Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:56.894Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:56.926Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:56.958Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:56.981Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.013Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.035Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.056Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.087Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.119Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.151Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.184Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.218Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.246Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.341Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.375Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.405Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.438Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:57.472Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:58.545Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:58.584Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:07:58.615Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-20_08_07_48-17328042593445841494 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:08:32.827Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:08:41.595Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:08:41.627Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:08:51.335Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
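
The "could be a quota issue" resize above can be checked against the region's Compute Engine quotas; a sketch with the google-api-python-client discovery client (field names follow the Compute regions.get response; credentials come from Application Default Credentials):

    from googleapiclient import discovery

    compute = discovery.build('compute', 'v1')
    region = compute.regions().get(
        project='apache-beam-testing', region='us-central1').execute()
    for quota in region.get('quotas', []):
        # CPUS and IN_USE_ADDRESSES commonly cap Dataflow worker pools.
        if quota['metric'] in ('CPUS', 'IN_USE_ADDRESSES'):
            print(quota['metric'], quota['usage'], '/', quota['limit'])
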
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:09:16.603Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:09:23.936Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:50:54.945Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:54:59.487Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T15:55:56.755Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T16:16:01.598Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T16:22:58.487Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T16:36:59.284Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T16:48:00.611Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T17:00:11.349Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T17:02:03.203Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T17:13:04.215Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T17:24:05.473Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T17:35:16.544Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T17:41:07.304Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T17:49:22.761Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T18:08:10.100Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T18:10:11.560Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T18:25:16.694Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T18:30:13.981Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T18:49:15.621Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T19:00:26.584Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T19:22:17.774Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T19:37:29.462Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T19:56:21.010Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T20:00:41.378Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-20_08_07_48-17328042593445841494.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T20:00:41.404Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T20:00:41.457Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T20:00:41.516Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T20:00:41.537Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-20T20:00:41.560Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-20_08_07_48-17328042593445841494 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-20_08_07_48-17328042593445841494?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 15s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/pssc2vtqmdcjs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #938

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/938/display/redirect>

Changes:


------------------------------------------
[...truncated 26.10 KB...]
  Using cached pytest-7.2.2-py3-none-any.whl (317 kB)
Collecting pytest-xdist<4,>=2.5.0
  Using cached pytest_xdist-3.2.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3
  Using cached SQLAlchemy-1.4.47-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5
  Using cached psycopg2_binary-2.9.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0
  Using cached cryptography-39.0.2-cp36-abi3-manylinux_2_28_x86_64.whl (4.2 MB)
Collecting hypothesis<=7.0.0,>5.0.0
  Using cached hypothesis-6.70.0-py3-none-any.whl (403 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.47.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.94
  Using cached botocore-1.29.94-py3-none-any.whl (10.5 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.47.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.3-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting dnspython<3.0.0,>=1.16.0
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (2.1.3)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (1.0.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.47.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.47.0.dev0-py3-none-any.whl size=3041044 sha256=684f50ea29f841aafe5145dff5067fcfc968f58a8411b1b32c060c8a1a7ac05a
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pymongo, pydot, pandas, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-cloud-core, google-apitools, boto3, azure-storage-blob, apache-beam, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.47.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.15.0 boto3-1.26.94 botocore-1.29.94 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.2 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.7.0 google-cloud-bigquery-storage-2.19.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.0 google-cloud-dlp-3.12.0 google-cloud-language-2.9.0 google-cloud-pubsub-2.15.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.2 google-cloud-spanner-3.28.0 google-cloud-videointelligence-2.11.0 google-cloud-vision-3.4.0 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.51.3 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.70.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.1 msal-1.21.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.7 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.2 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.20.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
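The pre-building hint above refers to Beam's SDK container pre-building workflow, which bakes the extra dependencies into a custom worker image once instead of reinstalling them on every worker. A minimal sketch of opting in from the Python SDK, assuming the option names available around Beam 2.47 and placeholder project/registry values:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',       # assumption: placeholder project id
        '--region=us-central1',
        '--prebuild_sdk_container_engine=cloud_build',
        # assumption: placeholder registry to push the prebuilt image to
        '--docker_registry_push_url=gcr.io/my-project/prebuilt-beam',
    ])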
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.47.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
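As the message notes, staging_location falls back to temp_location when it is not set. Both can be pinned explicitly through pipeline options; a sketch with placeholder bucket paths:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--temp_location=gs://my-bucket/tmp',         # assumption: placeholder bucket
        '--staging_location=gs://my-bucket/staging',  # assumption: placeholder bucket
    ])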
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0319150154.1679238468.304717/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0319150154.1679238468.304717/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0319150154.1679238468.304717/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0319150154.1679238468.304717/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230319150748305726-1861'
 createTime: '2023-03-19T15:07:49.428106Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-03-19_08_07_48-16374305742827999748'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0319150154'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-03-19T15:07:49.428106Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-03-19_08_07_48-16374305742827999748]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-03-19_08_07_48-16374305742827999748
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-03-19_08_07_48-16374305742827999748?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-19_08_07_48-16374305742827999748 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:56.776Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:57.998Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.027Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.079Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.170Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.198Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.244Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.290Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.317Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.347Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.411Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.435Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.456Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.479Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.503Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.524Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.545Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.567Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.591Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.613Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.668Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
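The fusion messages above outline the test pipeline: a synthetic source, a timing step, a global Top combine that the runner lifts into pre-shuffle (ConvertToAccumulators) and post-shuffle (Combine, Extract) phases, and a consuming step. A rough, simplified sketch of an equivalent pipeline, using beam.Create as a stand-in for the synthetic source:

    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(1000))  # stand-in for SyntheticSource
         # Top.Largest is CombineFn-backed, so the runner can lift it around the shuffle.
         | 'Combine with Top' >> combiners.Top.Largest(10)
         | 'Consume' >> beam.Map(len))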
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.742Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.777Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.798Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.819Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:58.841Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-19_08_07_48-16374305742827999748 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:59.901Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:59.922Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:07:59.959Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:08:23.455Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
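The descriptor-quota warning above can be cleared by deleting stale custom metric descriptors. A hedged sketch using the google-cloud-monitoring client; the project id and filter are assumptions from this log, and deletion should only target descriptors known to be unused:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'  # assumption: project from this log
    descriptors = client.list_metric_descriptors(request={
        'name': project_name,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        # Frees descriptor quota; irreversible for that descriptor's history.
        client.delete_metric_descriptor(name=descriptor.name)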
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:08:37.385Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:09:10.254Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:09:21.537Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:52:54.911Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:54:52.863Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T15:55:54.028Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T16:24:55.069Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T16:27:05.729Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T16:57:56.878Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T17:07:59.038Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T17:29:00.152Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T17:43:01.083Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T17:54:02.872Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T18:09:07.695Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T18:27:14.966Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T18:45:05.927Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T19:03:06.839Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T19:15:11.688Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T19:36:08.789Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T19:43:09.916Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-03-19_08_07_48-16374305742827999748 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T20:00:42.484Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-03-19_08_07_48-16374305742827999748.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T20:00:42.563Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T20:00:42.604Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T20:00:42.631Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T20:00:42.653Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-03-19T20:00:42.675Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-03-19_08_07_48-16374305742827999748?project=<ProjectId>
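The assertion is raised by load_test.py when wait_until_finish returns without the job reaching a terminal state. A minimal sketch of that wait-then-check pattern, assuming a Dataflow PipelineResult named result and an example timeout:

    from apache_beam.runners.runner import PipelineState

    timeout_ms = 4 * 60 * 60 * 1000  # assumption: example 4-hour timeout
    # Returns the job state once terminal, or the current state after the timeout.
    state = result.wait_until_finish(duration=timeout_ms)
    if state not in (PipelineState.DONE, PipelineState.CANCELLED, PipelineState.FAILED):
        result.cancel()  # request cancellation instead of waiting indefinitely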

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 10s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/l7jlzvskmwh6o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org