Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2023/06/02 00:02:12 UTC

Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1012

See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1012/display/redirect>

Changes:


------------------------------------------
[...truncated 35.14 KB...]
  Using cached botocore-1.29.144-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Downloading docker-6.1.3-py3-none-any.whl (148 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 148.1/148.1 kB 4.0 MB/s eta 0:00:00
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3080577 sha256=65d8042fef003b9cc6c478d443222ec20285be7fee834277a52fab1bdff1b580
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.144 botocore-1.29.144 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.9 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.14 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
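
For reference, the container pre-building workflow suggested above is driven by
pipeline options. A minimal sketch, assuming the Beam Python SDK's prebuilding
options and a placeholder registry URL (not the load test's actual configuration):

    # Pre-build the SDK worker container once so workers skip the
    # per-job dependency installation shown earlier in this log.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',  # project used by this job
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--prebuild_sdk_container_engine=cloud_build',  # or 'local_docker'
        '--docker_registry_push_url=gcr.io/my-project/prebuilt',  # placeholder
    ])
    with beam.Pipeline(options=options) as pipeline:
        _ = pipeline | beam.Create([1, 2, 3])  # placeholder transform
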
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0601132530.1685632462.782840/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0601132530.1685632462.782840/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0601132530.1685632462.782840/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0601132530.1685632462.782840/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230601151422784358-5369'
 createTime: '2023-06-01T15:14:25.418356Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-01_08_14_24-4537209592619833224'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0601132530'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-01T15:14:25.418356Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-01_08_14_24-4537209592619833224]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-01_08_14_24-4537209592619833224
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-01_08_14_24-4537209592619833224?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-01_08_14_24-4537209592619833224 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:29.890Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.137Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.166Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.282Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.383Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.423Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.509Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.579Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.752Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.808Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.849Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.883Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.907Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.939Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:31.964Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.037Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.066Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.104Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.150Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.197Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.312Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.397Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.437Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.470Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.531Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.789Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.831Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:32.909Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-01_08_14_24-4537209592619833224 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:14:53.266Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
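
The metric-descriptor message above is informational; if the custom metrics are
needed, unused descriptors can be listed (and, carefully, deleted) through the
Cloud Monitoring API it links to. A minimal sketch, assuming the
google-cloud-monitoring client library; the delete call is commented out
because it is destructive:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project = "projects/apache-beam-testing"  # project named in the job above
    request = {
        "name": project,
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # Destructive; only for descriptors that are no longer written to:
        # client.delete_metric_descriptor(
        #     name=f"{project}/metricDescriptors/{descriptor.type}")
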
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:24.277Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:24.298Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:34.181Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:34.220Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:44.220Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:15:54.271Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:16:01.511Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T15:39:29.799Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T16:09:57.704Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T16:11:01.240Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T16:35:59.250Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T16:37:04.237Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T16:38:05.253Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:03:03.270Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:04:04.158Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:06:05.894Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:29:08.143Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:43:09.646Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T17:55:10.161Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T18:17:11.773Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T18:21:16.524Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T18:42:13.805Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T18:47:18.384Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T19:07:20.098Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T19:12:17.430Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T19:32:22.045Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T19:40:23.235Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T19:58:22.064Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T20:07:23.854Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T20:33:25.980Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T20:34:26.756Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T20:58:29.171Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T21:00:54.947Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T21:22:42.492Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T21:46:44.540Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T22:13:35.918Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T22:30:40.879Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T22:39:38.220Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T22:56:39.392Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T23:05:41.339Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T23:22:42.225Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T23:31:44.192Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T23:48:48.968Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-01T23:57:49.779Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-01_08_14_24-4537209592619833224 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.459Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-01_08_14_24-4537209592619833224.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.508Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.602Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.628Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.652Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T00:01:15.683Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-01_08_14_24-4537209592619833224?project=<ProjectId>
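
What the traceback amounts to: wait_until_finish(duration=...) returns the
job's current state once the timeout elapses, and a streaming job is still
running at that point, so the harness cancels it (the JOB_STATE_CANCELLING
transition above) and raises. A simplified sketch of that pattern; the names
mirror the traceback, not the exact Beam source:

    from apache_beam.runners.runner import PipelineState

    def wait_or_cancel(result, timeout_ms):
        # result: a DataflowPipelineResult from pipeline.run()
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            result.cancel()  # produces the JOB_STATE_CANCELLING seen above
            raise AssertionError(
                'Job did not reach a terminal state within the timeout.')
        return state
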

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8h 54m 43s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/y64bqcn4axjn4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_LoadTests_Python_Combine_Dataflow_Streaming #1026

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1026/display/redirect>




Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1025

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1025/display/redirect>

Changes:


------------------------------------------
[...truncated 29.08 KB...]
Collecting requests_mock<2.0,>=1.7 (from apache-beam==2.49.0.dev0)
  Using cached requests_mock-1.11.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.2-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.78.2-py3-none-any.whl (416 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.153 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.153-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (6.6.0)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.3-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.1-py2.py3-none-any.whl (224 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3201909 sha256=c34d32a9124e89e10e4fdd8731f10e8a9581aea89f93bef215782fa319c1c7cb
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, threadpoolctl, tenacity, sqlparse, six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pymongo, pydot, pyasn1-modules, isodate, httplib2, grpcio-status, google-resumable-media, cffi, attrs, requests_mock, pytest, pandas, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.1 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.153 botocore-1.29.153 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.20.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.1 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.1 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.78.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.2 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 urllib3-1.26.16 websocket-client-1.5.3 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0614125356.1686755268.165640/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0614125356.1686755268.165640/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0614125356.1686755268.165640/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0614125356.1686755268.165640/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230614150748166753-8349'
 createTime: '2023-06-14T15:07:49.185557Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-14_08_07_48-12832743795784628884'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0614125356'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-14T15:07:49.185557Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-14_08_07_48-12832743795784628884]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-14_08_07_48-12832743795784628884
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-14_08_07_48-12832743795784628884?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-14_08_07_48-12832743795784628884 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:57.410Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.497Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.533Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.601Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.659Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.700Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.747Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.797Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.829Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.868Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.888Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.916Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.942Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:58.975Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.006Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.040Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.074Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.105Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.137Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.168Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.203Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.284Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.314Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.354Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.424Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.465Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.665Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.702Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:07:59.725Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-14_08_07_48-12832743795784628884 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:08:23.802Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:08:40.125Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:08:40.159Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:08:49.911Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:09:12.601Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:09:23.430Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:26:56.223Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T15:57:23.925Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T16:21:34.995Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T16:46:30.026Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T17:05:27.475Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T17:30:28.660Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T17:56:33.219Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T18:20:35.787Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T18:45:33.348Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T19:10:35.134Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T19:35:46.268Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-14_08_07_48-12832743795784628884 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.620Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-14_08_07_48-12832743795784628884.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.685Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.738Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.762Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.898Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-14T20:00:46.913Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-14_08_07_48-12832743795784628884?project=<ProjectId>
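The traceback shows the failure mode for this build: load_test.py bounds wait_until_finish with a timeout, and because the streaming job never reached a terminal state within the budget, the harness cancelled it (see JOB_STATE_CANCELLING above) and raised. A minimal sketch of the same wait-then-cancel pattern; the pipeline contents and option values here are assumptions for illustration, not the load test's configuration:

    # Hedged sketch of the wait/cancel pattern behind this traceback.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    options = PipelineOptions()  # real runs pass DataflowRunner options
    p = beam.Pipeline(options=options)
    _ = p | 'Make' >> beam.Create([1, 2, 3]) | 'Log' >> beam.Map(print)
    result = p.run()

    # Duration-bounded waits are honored by the Dataflow runner; a
    # non-terminal state after the budget means the job is still running.
    state = result.wait_until_finish(duration=60_000)  # milliseconds
    if not PipelineState.is_terminal(state):
        result.cancel()
        raise AssertionError('Job did not reach a terminal state: %s' % state)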

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 10s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/davxaab3q32nc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1024

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1024/display/redirect>

Changes:


------------------------------------------
[...truncated 29.12 KB...]
  Using cached google_cloud_pubsublite-1.7.0-py2.py3-none-any.whl (273 kB)
Collecting google-cloud-bigquery<4,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery-3.11.0-py2.py3-none-any.whl (219 kB)
Collecting google-cloud-bigquery-storage<3,>=2.6.3 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery_storage-2.20.0-py2.py3-none-any.whl (190 kB)
Collecting google-cloud-core<3,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<2.18.0,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_spanner-3.36.0-py2.py3-none-any.whl (332 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.10.0-py2.py3-none-any.whl (101 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.2-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.2-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.152 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.152-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (6.6.0)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.3-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.1-py2.py3-none-any.whl (224 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3201909 sha256=6e28f4a228c3cc30c003126a4bc58ad1c7641e19d399a067a9663ad72ef36af9
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, threadpoolctl, tenacity, sqlparse, six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pymongo, pydot, pyasn1-modules, isodate, httplib2, grpcio-status, google-resumable-media, cffi, attrs, requests_mock, pytest, pandas, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.152 botocore-1.29.152 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.1 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.78.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.2 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 urllib3-1.26.16 websocket-client-1.5.3 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0613125350.1686668861.047711/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0613125350.1686668861.047711/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0613125350.1686668861.047711/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0613125350.1686668861.047711/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230613150741048835-9768'
 createTime: '2023-06-13T15:07:43.892764Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-13_08_07_42-16500879349966031697'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0613125350'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-13T15:07:43.892764Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
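The <Job ...> block above is the request body the apiclient builds from the pipeline options before submission; the JOB_TYPE_STREAMING type follows from the --streaming flag. A hedged sketch of options that would produce a comparable streaming job (the values are illustrative assumptions, not the exact flags the Jenkins job passes):

    # Illustrative Dataflow options for a streaming job like the one above.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--streaming',
        '--job_name=load-tests-python-dataflow-streaming-combine-1',
    ])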
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-13_08_07_42-16500879349966031697]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-13_08_07_42-16500879349966031697
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-13_08_07_42-16500879349966031697?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-13_08_07_42-16500879349966031697 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:01.060Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:07.660Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.306Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.379Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.459Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.491Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.549Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.605Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.650Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.690Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.725Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.759Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.786Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.823Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.853Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.886Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.920Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:10.998Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.030Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.061Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.098Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.210Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.245Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.277Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.316Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.355Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.547Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.591Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:11.655Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:12.125Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-13_08_07_42-16500879349966031697 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:56.744Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:08:56.776Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:09:06.628Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:09:28.787Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:09:39.335Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:22:54.539Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T15:54:11.391Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T16:20:12.155Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T16:43:13.674Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T17:09:25.111Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T17:35:16.484Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T17:55:28.793Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T18:14:20.272Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T18:39:22.370Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T18:58:23.316Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T19:23:24.903Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T19:49:26.483Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-13_08_07_42-16500879349966031697 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.758Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-13_08_07_42-16500879349966031697.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.807Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.868Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.889Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.910Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-13T20:00:52.933Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-13_08_07_42-16500879349966031697?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 15s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/x2hzbqgo5frya

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1023

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1023/display/redirect?page=changes>

Changes:

[noreply] Adding error tags in BigQuery Write Transforms (#27020)


------------------------------------------
[...truncated 29.19 KB...]
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.78.1-py3-none-any.whl (416 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.49.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_core-1.27.0-py3-none-any.whl (174 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.151 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.151-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (6.6.0)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.3-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3201909 sha256=e90e236b6deb960228fd47beed149069ff90a2c3d585d9543a7208eddba03138
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, threadpoolctl, tenacity, sqlparse, six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pymongo, pydot, pyasn1-modules, isodate, httplib2, grpcio-status, google-resumable-media, cffi, attrs, requests_mock, pytest, pandas, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.151 botocore-1.29.151 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.78.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.2 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 urllib3-1.26.16 websocket-client-1.5.3 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0612125353.1686582475.376874/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0612125353.1686582475.376874/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0612125353.1686582475.376874/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0612125353.1686582475.376874/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230612150755377841-2334'
 createTime: '2023-06-12T15:07:56.501013Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-12_08_07_56-5142297964075332076'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0612125353'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-12T15:07:56.501013Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-12_08_07_56-5142297964075332076]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-12_08_07_56-5142297964075332076
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-12_08_07_56-5142297964075332076?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-12_08_07_56-5142297964075332076 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:07:59.754Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.763Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.785Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.848Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.908Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.936Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:00.988Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.054Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.092Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.118Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.147Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.181Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.219Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.251Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.273Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.302Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.326Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.347Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.372Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.395Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.416Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.500Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.529Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.553Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-12_08_07_56-5142297964075332076 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.598Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.636Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.806Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.838Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:01.893Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:08:33.913Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
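
The message above means the project has hit the 100-descriptor limit for Dataflow-created custom metrics, so new per-job custom metrics silently stop being exported. A minimal cleanup sketch using the google-cloud-monitoring client (the project id is an assumption, and deciding which descriptors are unused is left to the operator):

    # Sketch only: list Dataflow-created custom metric descriptors and
    # optionally delete ones that are no longer used.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"  # assumption: target project

    descriptors = client.list_metric_descriptors(request={
        "name": project_name,
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        print(descriptor.type)
        # Only delete after confirming the descriptor is unused:
        # client.delete_metric_descriptor(request={"name": descriptor.name})
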
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:09:41.127Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:09:41.154Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
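
"This could be a quota issue" usually means a regional Compute Engine quota (in-use CPUs, instances, or disks) cannot accommodate the requested pool. One way to spot the exhausted quota from Python, sketched under the assumption that google-api-python-client and application-default credentials are available:

    # Sketch only: print regional Compute Engine quotas near their limit.
    from googleapiclient import discovery

    compute = discovery.build("compute", "v1")
    region = compute.regions().get(
        project="apache-beam-testing", region="us-central1").execute()
    for quota in region.get("quotas", []):
        if quota["limit"] and quota["usage"] >= 0.9 * quota["limit"]:
            print(quota["metric"], quota["usage"], "/", quota["limit"])
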
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:09:56.283Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:10:03.660Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:10:58.581Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:10:58.609Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:11:18.860Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:34:18.861Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T15:39:29.428Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T16:03:27.090Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T16:28:32.651Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T16:53:39.826Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T17:19:40.941Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T17:39:32.132Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T18:04:33.286Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T18:29:34.185Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T18:55:35.318Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T19:21:36.542Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-12_08_07_56-5142297964075332076 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:07.935Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-12_08_07_56-5142297964075332076.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:07.976Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:08.024Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:08.045Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:08.073Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-12T20:01:08.097Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-12_08_07_56-5142297964075332076?project=<ProjectId>
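
The AssertionError is raised inside wait_until_finish: the streaming job never reached a terminal state within the test's timeout, so the harness cancelled it. The pattern the load test follows is roughly the sketch below (the names and the timeout value are illustrative, not the test's exact code):

    # Sketch only: bounded wait on a streaming Dataflow job, then cancel.
    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()  # assumption: `pipeline` is built elsewhere
    result.wait_until_finish(duration=4 * 60 * 60 * 1000)  # duration is in ms
    if not PipelineState.is_terminal(result.state):
        result.cancel()  # streaming jobs never finish on their own
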

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 8s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/gsxvqjtcjp5vq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1022

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1022/display/redirect?page=changes>

Changes:

[noreply] Add required commands to allowlist_externals in tox.ini (#27089)


------------------------------------------
[...truncated 129.48 KB...]
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/sdk_****.py", line 295, in _execute
    response = task()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/sdk_****.py", line 370, in <lambda>
    lambda: self.create_****().do_instruction(request), request)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/sdk_****.py", line 630, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/sdk_****.py", line 661, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/sdk_****.py", line 496, in get
    self.data_sampler)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/bundle_processor.py", line 877, in __init__
    _verify_descriptor_created_in_a_compatible_env(process_bundle_descriptor)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/runners/****/bundle_processor.py", line 841, in _verify_descriptor_created_in_a_compatible_env
    "Pipeline construction environment and pipeline runtime "
RuntimeError: Pipeline construction environment and pipeline runtime environment are not compatible. If you use a custom container image, check that the Python interpreter minor version and the Apache Beam version in your image match the versions used at pipeline construction time. Submission environment: beam:version:sdk_base:apache/beam_python3.7_sdk:2.49.0.dev. Runtime environment: beam:version:sdk_base:apache/beam_python3.7_sdk:2.48.0.dev.
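
The RuntimeError is self-describing: the job graph was built with a 2.49.0.dev SDK but the workers ran a 2.48.0.dev container, and Beam refuses to execute a descriptor built in an incompatible environment. Pinning the worker image to the submitting SDK's version avoids the skew; a hedged sketch using the standard sdk_container_image pipeline option (the image tag here is an assumption):

    # Sketch only: keep the runtime container in lockstep with the
    # SDK used at pipeline construction time.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=DataflowRunner",
        "--sdk_container_image=apache/beam_python3.7_sdk:2.49.0.dev",  # assumption
    ])
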

[...11 identical JOB_MESSAGE_ERROR tracebacks (2023-06-11T19:27:58Z through 2023-06-11T19:33:00Z) omitted; each repeats the RuntimeError shown above...]

INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-11T19:39:10.618Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-11_08_07_44-17811166637571296847 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-11_08_07_44-17811166637571296847?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 54m 18s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/h5cuf2bvfk3dc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1021

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1021/display/redirect>

Changes:


------------------------------------------
[...truncated 28.98 KB...]
Collecting google-cloud-core<3,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<2.18.0,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_spanner-3.36.0-py2.py3-none-any.whl (332 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.10.0-py2.py3-none-any.whl (101 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.2-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.2-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.49.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_core-1.27.0-py3-none-any.whl (174 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Collecting boto3<2,>=1.9 (from apache-beam==2.49.0.dev0)
  Using cached boto3-1.26.151-py3-none-any.whl (135 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.151 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.151-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (6.6.0)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.3-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3201909 sha256=544ebcd4c7bdff7a23ef45b4fc4ea8c0cdd095c0d9fd9f92ca9dd99ad93e44ae
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, threadpoolctl, tenacity, sqlparse, six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pymongo, pydot, pyasn1-modules, isodate, httplib2, grpcio-status, google-resumable-media, cffi, attrs, requests_mock, pytest, pandas, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.151 botocore-1.29.151 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.77.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 urllib3-1.26.16 websocket-client-1.5.3 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
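
The pre-building hint above refers to Dataflow's SDK container pre-building workflow, which bakes the extra dependencies into a worker image once at submission time instead of reinstalling them on every worker at startup. A sketch of the relevant pipeline options, assuming Cloud Build is enabled and the registry path is writable (both are assumptions here):

    # Sketch only: pre-build the SDK worker container via Cloud Build.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=DataflowRunner",
        "--prebuild_sdk_container_engine=cloud_build",
        "--docker_registry_push_url=gcr.io/apache-beam-testing/prebuilt",  # assumption
    ])
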
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0610125353.1686409650.670823/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0610125353.1686409650.670823/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0610125353.1686409650.670823/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0610125353.1686409650.670823/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230610150730672162-7951'
 createTime: '2023-06-10T15:07:31.897805Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-10_08_07_31-1241553607541181454'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0610125353'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-10T15:07:31.897805Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-10_08_07_31-1241553607541181454]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-10_08_07_31-1241553607541181454
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-10_08_07_31-1241553607541181454?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-10_08_07_31-1241553607541181454 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:35.323Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:41.715Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:41.832Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:41.896Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:41.965Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:41.993Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.057Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.102Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.130Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.159Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.176Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.197Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.228Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.261Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.293Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.325Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.357Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.390Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.421Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.454Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.488Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.578Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.604Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.636Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.666Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.698Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.879Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.915Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:42.955Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:07:46.284Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-10_08_07_31-1241553607541181454 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:08:22.508Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:08:54.941Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:09:05.175Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:11:07.390Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T15:41:35.317Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T16:06:37.822Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T16:31:48.840Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T16:56:38.749Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T17:21:39.796Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T18:04:41.206Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T18:24:52.234Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T18:50:43.297Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T19:15:44.433Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T19:39:45.299Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-10_08_07_31-1241553607541181454 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.800Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-10_08_07_31-1241553607541181454.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.859Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.921Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.938Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.969Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-10T20:00:51.992Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-10_08_07_31-1241553607541181454?project=<ProjectId>
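The frames above show what failed: the load test ran the pipeline, waited up to timeout_ms for a terminal state, and asserted when the streaming job never reached one. A sketch of a wait-then-cancel guard consistent with this log (result stands for the PipelineResult returned by running the pipeline; the names are illustrative, not the exact load_test.py code):

    from apache_beam.runners.runner import PipelineState

    state = result.wait_until_finish(duration=timeout_ms)
    if not PipelineState.is_terminal(state):
        # Ask the service to cancel; the job then passes through
        # JOB_STATE_CANCELLING, as seen in the log above.
        result.cancel()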

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 29s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/qedwmwadmwdw2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1020

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1020/display/redirect>

Changes:


------------------------------------------
[...truncated 29.12 KB...]
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.77.0-py3-none-any.whl (416 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.49.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_core-1.27.0-py3-none-any.whl (174 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Collecting six>=1.11.0 (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0)
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.150 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.150-py3-none-any.whl (10.9 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Requirement already satisfied: tomli>=1.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (6.6.0)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.3-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3201884 sha256=a3ad9c824bc5c6dd87b26d1eaa62ab44e50528f5c8ef834aa96a3fdc1e8a99ba
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, threadpoolctl, tenacity, sqlparse, six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pymongo, pydot, pyasn1-modules, isodate, httplib2, grpcio-status, google-resumable-media, cffi, attrs, requests_mock, pytest, pandas, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, azure-core, testcontainers, s3transfer, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.150 botocore-1.29.150 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.77.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 six-1.16.0 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 urllib3-1.26.16 websocket-client-1.5.3 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
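The pre-building workflow that this message recommends is opt-in via pipeline options. A sketch of what enabling it looks like, with flag names taken from the linked guide and a placeholder registry URL:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        # Build the SDK worker container once, up front, instead of
        # installing the extra dependencies on every worker at startup.
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/my-project/prebuilt-sdk',  # placeholder
    ])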
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0609130237.1686323325.906078/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0609130237.1686323325.906078/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0609130237.1686323325.906078/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0609130237.1686323325.906078/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230609150845907209-9873'
 createTime: '2023-06-09T15:08:47.315547Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-09_08_08_46-1785475662426921351'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0609130237'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-09T15:08:47.315547Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-09_08_08_46-1785475662426921351]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-09_08_08_46-1785475662426921351
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-09_08_08_46-1785475662426921351?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-09_08_08_46-1785475662426921351 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:08:57.725Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:03.994Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:09.002Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.109Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.169Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.189Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.247Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.303Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.343Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.374Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.400Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.429Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.460Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.492Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.523Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.539Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.568Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.599Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.630Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.662Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.695Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.797Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.823Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.851Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.880Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:12.898Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:13.058Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:13.086Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:13.135Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:15.499Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-09_08_08_46-1785475662426921351 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:09:56.693Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:10:26.149Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:10:33.699Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:15:31.245Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T15:39:36.605Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T16:05:37.926Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T16:25:41.272Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T16:51:41.030Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T17:17:52.051Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T17:36:42.941Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T17:55:53.551Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T18:20:44.690Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T18:53:45.867Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T19:18:47.475Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T19:43:48.566Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:08:59.901Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-09_08_08_46-1785475662426921351 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.119Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-09_08_08_46-1785475662426921351.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.158Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.202Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.229Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.249Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-09T20:32:34.274Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-09_08_08_46-1785475662426921351?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5h 26m 9s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/5qmptm54mhxua

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1019

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1019/display/redirect>

Changes:


------------------------------------------
[...truncated 32.34 KB...]
Collecting requests_mock<2.0,>=1.7 (from apache-beam==2.49.0.dev0)
  Using cached requests_mock-1.11.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.76.0-py3-none-any.whl (414 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.149 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.149-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3202652 sha256=5d8295a4cdc03aaa4ae9f22a436b7504f45fe952b7a7032b1d32f858eeb5f69d
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.149 botocore-1.29.149 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.76.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0608131119.1686236896.884726/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0608131119.1686236896.884726/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0608131119.1686236896.884726/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0608131119.1686236896.884726/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230608150816885778-5417'
 createTime: '2023-06-08T15:08:17.975826Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-08_08_08_17-12562312136469905881'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0608131119'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-08T15:08:17.975826Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-08_08_08_17-12562312136469905881]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-08_08_08_17-12562312136469905881
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-08_08_08_17-12562312136469905881?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-08_08_08_17-12562312136469905881 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:22.938Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.739Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.771Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.834Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.893Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.940Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:27.987Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.044Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.084Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.111Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.146Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.178Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.201Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.234Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.265Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.300Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.333Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.354Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.385Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.409Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.431Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.509Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.537Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.561Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.596Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:28.620Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:29.100Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:29.128Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:29.158Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-08_08_08_17-12562312136469905881 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:08:54.924Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:09:23.886Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:09:23.908Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:09:33.725Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:09:57.719Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:10:08.421Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:20:06.823Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T15:50:37.860Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T16:33:43.389Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T16:56:40.586Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T17:21:41.975Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T17:47:42.691Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T18:12:43.693Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T18:39:44.993Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T19:04:46.347Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T19:29:57.566Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T19:54:49.238Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-08_08_08_17-12562312136469905881 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.084Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-08_08_08_17-12562312136469905881.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.105Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.155Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.181Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.205Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-08T20:00:53.227Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1558, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-08_08_08_17-12562312136469905881?project=<ProjectId>
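The traceback above comes from the load test's wait loop: wait_until_finish(duration=...) returns once the given number of milliseconds elapses even if the job is still running, and the test then asserts on a terminal state. A minimal sketch of that pattern, inferred from the calls in the traceback rather than copied from the test (wait_or_cancel and timeout_ms are illustrative names):

    from apache_beam.runners.runner import PipelineState

    def wait_or_cancel(result, timeout_ms):
        # `result` is the PipelineResult returned by Pipeline.run().
        # wait_until_finish() returns after `timeout_ms` milliseconds even if
        # the job has not finished, so the caller must re-check the state.
        result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(result.state):
            result.cancel()  # cancellation on Dataflow is asynchronous
            raise AssertionError(
                'Job did not reach a terminal state: %s' % result.state)

This matches the JOB_STATE_CANCELLING transition above: the job is cancelled once the wait expires, but cancellation itself takes time, so the assertion fires before the job reaches a terminal state.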

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 8s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/fqx32dbltplh4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1018

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1018/display/redirect>

Changes:


------------------------------------------
[...truncated 32.33 KB...]
  Using cached requests_mock-1.10.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.76.0-py3-none-any.whl (414 kB)
Collecting boto3<2,>=1.9 (from apache-beam==2.49.0.dev0)
  Using cached boto3-1.26.148-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.148 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.148-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3199731 sha256=39032d395e20967c971d1d21f9d3237770036bf94d8351711e598b5e88f51b75
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.148 botocore-1.29.148 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.76.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0607125432.1686150469.748015/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0607125432.1686150469.748015/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0607125432.1686150469.748015/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0607125432.1686150469.748015/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230607150749749050-6760'
 createTime: '2023-06-07T15:07:50.816354Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-07_08_07_50-4886773571697503226'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0607125432'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-07T15:07:50.816354Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
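For reference, a hedged sketch of the pipeline options that would produce a streaming Dataflow job shaped like the Job message above; the values are taken from the log, but this is illustrative, not the load test's actual launcher code:

    from apache_beam import Pipeline
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner='DataflowRunner',
        project='apache-beam-testing',
        region='us-central1',
        temp_location='gs://temp-storage-for-perf-tests/smoketests',
        job_name='load-tests-python-dataflow-streaming-combine-1-0607125432',
        streaming=True,  # yields type JOB_TYPE_STREAMING
    )
    pipeline = Pipeline(options=options)
    # ... the Combine load-test transforms would be applied here ...
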
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-07_08_07_50-4886773571697503226]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-07_08_07_50-4886773571697503226
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-07_08_07_50-4886773571697503226?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-07_08_07_50-4886773571697503226 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:02.805Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:08.968Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:08.995Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.055Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.125Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.165Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.229Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.285Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.313Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.341Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.364Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.386Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.411Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.447Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.472Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.496Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.523Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.550Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.584Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.608Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.641Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.757Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.788Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.810Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.838Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:09.863Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:10.019Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:10.056Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:10.090Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-07_08_07_50-4886773571697503226 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:08:32.615Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:09:02.663Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:09:35.103Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:09:45.387Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:10:00.629Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T15:41:20.262Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T16:06:17.427Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T16:31:19.078Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T16:56:21.067Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T17:21:22.125Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T17:47:23.279Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T18:13:44.883Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T18:38:26.072Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T19:03:26.671Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T19:29:27.874Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T19:54:29.367Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-07_08_07_50-4886773571697503226 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.741Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-07_08_07_50-4886773571697503226.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.766Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.827Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.852Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.874Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-07T20:00:56.896Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-07_08_07_50-4886773571697503226?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 20s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/2pkbvtv67jvhm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1017

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1017/display/redirect>

Changes:


------------------------------------------
[...truncated 32.53 KB...]
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.76.0-py3-none-any.whl (414 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.49.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_core-1.27.0-py3-none-any.whl (174 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.147 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.147-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3199731 sha256=8ea00cca39ed6fb00a6c771d17068be7a6edcbcc5b6771776b7ddad4699c2bbc
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.147 botocore-1.29.147 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.76.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0606131707.1686064903.630362/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0606131707.1686064903.630362/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0606131707.1686064903.630362/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0606131707.1686064903.630362/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230606152143632368-9794'
 createTime: '2023-06-06T15:21:46.791860Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-06_08_21_46-16943315462094997206'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0606131707'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-06T15:21:46.791860Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-06_08_21_46-16943315462094997206]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-06_08_21_46-16943315462094997206
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-06_08_21_46-16943315462094997206?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-06_08_21_46-16943315462094997206 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:49.561Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:50.865Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:50.883Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:50.954Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.013Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.043Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.111Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.180Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.234Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.264Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.291Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.319Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.342Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.380Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.411Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.443Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.477Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.510Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.537Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.566Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.601Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.698Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.721Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.750Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.787Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.819Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-06_08_21_46-16943315462094997206 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:51.993Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:52.016Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:21:52.065Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:22:13.674Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
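
The cleanup this message suggests can be scripted instead of clicked through the API explorer. A hypothetical sketch using the google-cloud-monitoring client library (project name taken from the log; the delete call is left commented out):

    # Hypothetical cleanup sketch for the custom-metric-descriptor limit
    # mentioned above.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'

    # List only custom descriptors, i.e. the ones counted against the limit.
    request = {
        'name': project_name,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # Uncomment to delete a descriptor that is no longer used:
        # client.delete_metric_descriptor(request={
        #     'name': f'{project_name}/metricDescriptors/{descriptor.type}'})
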
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:22:35.687Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:23:10.828Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:23:21.329Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T15:45:01.346Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T16:15:14.980Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T16:40:21.434Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T17:05:19.293Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T17:24:20.206Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T17:43:31.897Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:02:23.470Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:22:24.670Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T18:42:15.829Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:01:27.233Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:35:29.995Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T19:54:30.692Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-06_08_21_46-16943315462094997206 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.083Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-06_08_21_46-16943315462094997206.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.111Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.154Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.179Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.208Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-06T20:00:51.224Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-06_08_21_46-16943315462094997206?project=<ProjectId>
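
The assertion fires because wait_until_finish(duration=self.timeout_ms) returned while the streaming job was still non-terminal (it was only entering JOB_STATE_CANCELLING). A caller that needs a terminal state has to cancel and wait again; a hypothetical guard is sketched below, where `result` and `timeout_ms` are assumed to come from code like load_test.py above (this is not what load_test.py actually does):

    # Hypothetical guard around a Dataflow PipelineResult.
    from apache_beam.runners.runner import PipelineState

    state = result.wait_until_finish(duration=timeout_ms)
    if state not in (PipelineState.DONE,
                     PipelineState.FAILED,
                     PipelineState.CANCELLED):
        result.cancel()                     # ask the service to cancel the job
        state = result.wait_until_finish()  # block until a terminal state
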

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 43m 18s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/kiemsgqsjo2l4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1016

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1016/display/redirect?page=changes>

Changes:

[noreply] Removing an unnecessary dependency (#27001)


------------------------------------------
[...truncated 32.75 KB...]
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.10.0-py2.py3-none-any.whl (101 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.2-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.2-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Collecting azure-storage-blob<13,>=12.3.2 (from apache-beam==2.49.0.dev0)
  Using cached azure_storage_blob-12.16.0-py3-none-any.whl (387 kB)
Collecting azure-core<2,>=1.7.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_core-1.27.0-py3-none-any.whl (174 kB)
Collecting azure-identity<2,>=1.12.0 (from apache-beam==2.49.0.dev0)
  Using cached azure_identity-1.13.0-py3-none-any.whl (151 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.146 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.146-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3088274 sha256=a2f6f355b217e9c5b044b2315991a05d77f09a08f1c4163e5d7c35d1332582e1
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.146 botocore-1.29.146 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.76.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
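
The pre-building workflow this hint refers to is switched on through pipeline options. A hypothetical sketch, assuming the SetupOptions flag names from the linked guide; the registry URL is a placeholder, not a value from this job:

    # Hypothetical options enabling the SDK container pre-building workflow.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        prebuild_sdk_container_engine='cloud_build',  # or 'local_docker'
        docker_registry_push_url='us-central1-docker.pkg.dev/PROJECT/REPO',
        # ... plus the usual runner/project/region/temp_location options ...
    )
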
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0605125355.1685977697.088578/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0605125355.1685977697.088578/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0605125355.1685977697.088578/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0605125355.1685977697.088578/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230605150817089670-9495'
 createTime: '2023-06-05T15:08:18.313892Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-05_08_08_17-8750074020061385336'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0605125355'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-05T15:08:18.313892Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-05_08_08_17-8750074020061385336]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-05_08_08_17-8750074020061385336
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-05_08_08_17-8750074020061385336?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-05_08_08_17-8750074020061385336 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:30.897Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:33.716Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:33.751Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:33.824Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:33.922Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:33.961Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.031Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.104Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.167Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.222Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.258Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.282Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.315Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.459Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.541Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.642Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.706Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.783Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.857Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.888Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:34.930Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.134Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.205Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.249Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.296Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.353Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.588Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.661Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:35.772Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-05_08_08_17-8750074020061385336 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:08:39.377Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:09:19.928Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:10:10.205Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:10:20.262Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:11:29.842Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:16:12.901Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:16:41.095Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T15:40:35.712Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T16:05:37.212Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T16:29:43.667Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T16:42:17.326Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T16:42:28.827Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T16:49:39.580Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T17:08:41.423Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T17:27:43.168Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T17:46:43.817Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T18:06:55.312Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T18:26:47.045Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T18:46:47.939Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T19:05:48.770Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T19:25:50.362Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T19:45:51.293Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:10:52.960Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.522Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-05_08_08_17-8750074020061385336.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.570Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.652Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.678Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.712Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-05T20:21:52.736Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-05_08_08_17-8750074020061385336 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-05_08_08_17-8750074020061385336?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5h 16m 14s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/sbqrcbnfp545g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1015

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1015/display/redirect>

Changes:


------------------------------------------
[...truncated 32.31 KB...]
Collecting requests_mock<2.0,>=1.7 (from apache-beam==2.49.0.dev0)
  Using cached requests_mock-1.10.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.75.9-py3-none-any.whl (414 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.146 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.146-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3088274 sha256=ba997a4e1ff71ae6cbb6a1e639f5013e0fa5d921874292fae1cb1ce61ae65b21
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.146 botocore-1.29.146 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.9 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0604125354.1685891271.678458/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0604125354.1685891271.678458/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0604125354.1685891271.678458/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0604125354.1685891271.678458/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230604150751680141-6082'
 createTime: '2023-06-04T15:07:53.062987Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-04_08_07_52-17522196373556568990'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0604125354'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-04T15:07:53.062987Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-04_08_07_52-17522196373556568990]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-04_08_07_52-17522196373556568990
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-04_08_07_52-17522196373556568990?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-04_08_07_52-17522196373556568990 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:57.610Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.640Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.669Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.718Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.783Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.798Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.852Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.894Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.939Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.972Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:58.994Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.025Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.057Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.091Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.125Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.147Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.179Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.256Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.280Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.311Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.397Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.444Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.476Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.500Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.521Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.713Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.752Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:07:59.780Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-04_08_07_52-17522196373556568990 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:08:31.588Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
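
The metric-descriptor warning above can be cleared by deleting unused custom descriptors through the Cloud Monitoring API it links to. A hedged sketch using the google-cloud-monitoring client (not part of this load test; the project ID is a placeholder):

    # Sketch: list custom metric descriptors and delete unused ones so
    # Dataflow can create new user metrics. Assumes the google-cloud-monitoring
    # package; the project ID is a placeholder.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/my-project',  # placeholder
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # Confirm a descriptor is really unused before deleting it.
        client.delete_metric_descriptor(request={'name': descriptor.name})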
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:08:48.571Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:09:24.885Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:09:25.782Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:09:35.002Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T15:41:52.599Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T16:03:54.426Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T16:28:55.672Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T16:53:56.795Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T17:20:57.843Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T17:45:59.136Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T18:11:00.834Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T18:36:03.034Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T18:39:25.785Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T19:08:05.676Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T19:33:07.184Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T19:57:08.535Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-04_08_07_52-17522196373556568990 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.781Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-04_08_07_52-17522196373556568990.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.815Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.880Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.904Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.929Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-04T20:00:37.950Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-04_08_07_52-17522196373556568990?project=<ProjectId>
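
The assertion above comes from load_test.py waiting on wait_until_finish(duration=self.timeout_ms) for a streaming job that never reaches a terminal state. A minimal sketch of that pattern with an explicit cancel fallback, using the standard Beam PipelineResult API; the trivial pipeline and the timeout value are placeholders, not the harness's actual settings:

    # Sketch: bounded wait on a pipeline result, cancelling the job if it is
    # still running when the timeout expires. The stand-in pipeline and the
    # timeout are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    pipeline = beam.Pipeline(options=PipelineOptions())
    _ = pipeline | beam.Create([1, 2, 3])  # stand-in for the real workload

    result = pipeline.run()
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)  # ms, placeholder
    if state not in (PipelineState.DONE,
                     PipelineState.FAILED,
                     PipelineState.CANCELLED):
        result.cancel()  # do not leave a stuck streaming job running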

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/s62cdvvlbcsje

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1014

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1014/display/redirect>

Changes:


------------------------------------------
[...truncated 32.23 KB...]
  Using cached google_cloud_pubsub-2.17.1-py2.py3-none-any.whl (265 kB)
Collecting google-cloud-pubsublite<2,>=1.2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_pubsublite-1.7.0-py2.py3-none-any.whl (273 kB)
Collecting google-cloud-bigquery<4,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery-3.11.0-py2.py3-none-any.whl (219 kB)
Collecting google-cloud-bigquery-storage<3,>=2.6.3 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigquery_storage-2.20.0-py2.py3-none-any.whl (190 kB)
Collecting google-cloud-core<3,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_core-2.3.2-py2.py3-none-any.whl (29 kB)
Collecting google-cloud-bigtable<2.18.0,>=2.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_bigtable-2.17.0-py2.py3-none-any.whl (288 kB)
Collecting google-cloud-spanner<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_spanner-3.35.1-py2.py3-none-any.whl (332 kB)
Collecting google-cloud-dlp<4,>=3.0.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_dlp-3.12.1-py2.py3-none-any.whl (143 kB)
Collecting google-cloud-language<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_language-2.10.0-py2.py3-none-any.whl (101 kB)
Collecting google-cloud-videointelligence<3,>=2.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_videointelligence-2.11.2-py2.py3-none-any.whl (229 kB)
Collecting google-cloud-vision<4,>=2 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_vision-3.4.2-py2.py3-none-any.whl (444 kB)
Collecting google-cloud-recommendations-ai<0.11.0,>=0.1.0 (from apache-beam==2.49.0.dev0)
  Using cached google_cloud_recommendations_ai-0.10.3-py2.py3-none-any.whl (173 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.146 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.146-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3088274 sha256=c0508a2041b0add787568225bafa42e0853e624c9105cb222cb8972ed1974861
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.146 botocore-1.29.146 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.9 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0603125346.1685804872.311847/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0603125346.1685804872.311847/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0603125346.1685804872.311847/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0603125346.1685804872.311847/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230603150752312864-9714'
 createTime: '2023-06-03T15:07:53.534900Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-03_08_07_52-12972041187723029300'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0603125346'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-03T15:07:53.534900Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-03_08_07_52-12972041187723029300]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-03_08_07_52-12972041187723029300
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-03_08_07_52-12972041187723029300?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-03_08_07_52-12972041187723029300 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:07:58.769Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.285Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.312Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.360Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.417Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.441Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.504Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.553Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.587Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.613Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.641Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.710Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.730Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.763Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.799Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.823Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.906Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.940Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.972Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:00.997Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.095Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.137Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.158Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.179Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.206Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.384Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.413Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:01.441Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-03_08_07_52-12972041187723029300 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:25.684Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:08:41.767Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:09:16.792Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:09:27.440Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:15:36.081Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T15:44:55.552Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T16:11:07.115Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T16:36:58.406Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T17:02:01.058Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T17:24:02.951Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T17:49:04.149Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T18:16:15.641Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T18:41:07.200Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T19:06:07.966Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T19:32:09.562Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T19:57:10.629Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.058Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-03_08_07_52-12972041187723029300.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.086Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.148Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.172Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.188Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-03T20:00:56.207Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-03_08_07_52-12972041187723029300 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-03_08_07_52-12972041187723029300?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 30s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/vzzzggkvismtw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #1013

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/1013/display/redirect>

Changes:


------------------------------------------
[...truncated 32.20 KB...]
Collecting pyyaml<7.0.0,>=3.12 (from apache-beam==2.49.0.dev0)
  Using cached PyYAML-6.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (596 kB)
Collecting requests_mock<2.0,>=1.7 (from apache-beam==2.49.0.dev0)
  Using cached requests_mock-1.10.0-py2.py3-none-any.whl (28 kB)
Collecting tenacity<9,>=8.0.0 (from apache-beam==2.49.0.dev0)
  Using cached tenacity-8.2.2-py3-none-any.whl (24 kB)
Collecting pytest<8.0,>=7.1.2 (from apache-beam==2.49.0.dev0)
  Using cached pytest-7.3.1-py3-none-any.whl (320 kB)
Collecting pytest-xdist<4,>=2.5.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Collecting pytest-timeout<3,>=2.1.0 (from apache-beam==2.49.0.dev0)
  Using cached pytest_timeout-2.1.0-py3-none-any.whl (12 kB)
Collecting scikit-learn>=0.20.0 (from apache-beam==2.49.0.dev0)
  Using cached scikit_learn-1.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (24.8 MB)
Collecting sqlalchemy<2.0,>=1.3 (from apache-beam==2.49.0.dev0)
  Using cached SQLAlchemy-1.4.48-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5 (from apache-beam==2.49.0.dev0)
  Using cached psycopg2_binary-2.9.6-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting testcontainers[mysql]<4.0.0,>=3.0.3 (from apache-beam==2.49.0.dev0)
  Using cached testcontainers-3.7.1-py2.py3-none-any.whl (45 kB)
Collecting cryptography>=36.0.0 (from apache-beam==2.49.0.dev0)
  Using cached cryptography-41.0.1-cp37-abi3-manylinux_2_28_x86_64.whl (4.3 MB)
Collecting hypothesis<=7.0.0,>5.0.0 (from apache-beam==2.49.0.dev0)
  Using cached hypothesis-6.75.9-py3-none-any.whl (414 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core<2,>=1.7.0->apache-beam==2.49.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.20.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal-1.22.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0 (from azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting isodate>=0.6.1 (from azure-storage-blob<13,>=12.3.2->apache-beam==2.49.0.dev0)
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting botocore<1.30.0,>=1.29.145 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached botocore-1.29.145-py3-none-any.whl (10.8 MB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.49.0.dev0)
  Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting urllib3<2.0 (from google-auth<3,>=1.18.0->apache-beam==2.49.0.dev0)
  Using cached urllib3-1.26.16-py2.py3-none-any.whl (143 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<2.18.0,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached grpcio_status-1.54.2-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.49.0.dev0)
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.4.4 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.49.0.dev0)
  Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.49.0.dev0)
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.5.2-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3080678 sha256=1f82aadbd8d9986f596c792cd10232e0b6595f7a068c4443f703d111fb1028d6
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 azure-core-1.27.0 azure-identity-1.13.0 azure-storage-blob-12.16.0 boto3-1.26.145 botocore-1.29.145 cachetools-5.3.1 certifi-2023.5.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.19.1 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.0 google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.35.1 google-cloud-videointelligence-2.11.2 google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.9 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.0 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2023.5.5 requests-2.31.0 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.16 websocket-client-1.5.2 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
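[Editor's note: the pre-building workflow referenced above is enabled through pipeline options. A minimal sketch, assuming the Beam Python SDK's documented pre-building flags; the project, bucket, and registry values below are placeholders, not this job's configuration:

    # Bake the extra dependencies into an SDK container image once, instead
    # of re-installing them on every worker at startup. Project, bucket, and
    # registry URL are placeholders.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                        # placeholder
        '--temp_location=gs://my-bucket/tmp',          # placeholder
        '--staging_location=gs://my-bucket/staging',   # placeholder
        '--requirements_file=requirements.txt',        # the extra dependencies
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/my-project/prebuilt',  # placeholder
    ])
]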
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.49.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0602125348.1685718470.573914/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0602125348.1685718470.573914/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0602125348.1685718470.573914/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0602125348.1685718470.573914/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230602150750575017-4669'
 createTime: '2023-06-02T15:07:51.794282Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-02_08_07_51-2028071261042916753'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0602125348'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-02T15:07:51.794282Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-06-02_08_07_51-2028071261042916753]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-06-02_08_07_51-2028071261042916753
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-02_08_07_51-2028071261042916753?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-02_08_07_51-2028071261042916753 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:56.751Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.064Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.099Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.179Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.246Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.269Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.332Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.402Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.435Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.472Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.504Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.533Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.568Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.598Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.623Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.648Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.679Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.701Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.724Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.758Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.790Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
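[Editor's note: the fused steps above are what a globally windowed Top combine expands into: CombineGlobally adds the KeyWithVoid, CombinePerKey, and UnKey stages seen in these messages. A minimal sketch of a pipeline with this shape; the synthetic-source spec and top count are illustrative, not the exact load-test configuration:

    # Illustrative pipeline with the same shape as the fused graph above.
    import apache_beam as beam
    from apache_beam.testing.synthetic_pipeline import SyntheticSource

    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read synthetic' >> beam.io.Read(SyntheticSource(
                {'numRecords': 1000, 'keySizeBytes': 10, 'valueSizeBytes': 90}))
            # Expands into KeyWithVoid -> CombinePerKey/GroupByKey ->
            # Combine/Extract -> UnKey, matching the fusion messages.
            | 'Combine with Top 0' >> beam.CombineGlobally(
                beam.combiners.TopCombineFn(16))
            | 'Consume 0' >> beam.Map(len))
]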
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.899Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.935Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.965Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:58.997Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:59.031Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:59.209Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:59.245Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:07:59.287Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-02_08_07_51-2028071261042916753 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:08:16.420Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
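[Editor's note: if the 100-descriptor limit mentioned above matters for a project, stale custom descriptors can be removed through the Monitoring API, as the message suggests. A hedged sketch, assuming the google-cloud-monitoring client library; the project id is a placeholder, and deletion is irreversible, so inspect the list before enabling the delete call:

    # List custom metric descriptors; review descriptor.type before deleting.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = monitoring_v3.ListMetricDescriptorsRequest(
        name='projects/my-project',  # placeholder
        filter='metric.type = starts_with("custom.googleapis.com/")')
    for descriptor in client.list_metric_descriptors(request=request):
        print('would delete', descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)
]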
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:08:29.081Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:08:39.962Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:09:29.981Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:09:40.211Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T15:38:29.081Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T16:49:56.509Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T17:15:01.340Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T17:39:58.517Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T18:04:59.796Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T18:30:14.606Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T18:55:13.512Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T19:20:05.868Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T19:45:07.790Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-06-02_08_07_51-2028071261042916753 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:08.789Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-06-02_08_07_51-2028071261042916753.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:08.831Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:08.948Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:08.967Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:08.990Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-02T20:01:09.012Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-02_08_07_51-2028071261042916753?project=<ProjectId>
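[Editor's note: the assertion above fires in the load-test harness when wait_until_finish returns without the job reaching a terminal state. A minimal sketch of bounding the wait and cancelling explicitly; the timeout value and the `pipeline` object are assumptions, not the harness's actual configuration:

    # Bound the wait and cancel on timeout instead of waiting indefinitely.
    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()  # pipeline: an apache_beam.Pipeline (assumed)
    state = result.wait_until_finish(duration=60 * 60 * 1000)  # ms; illustrative
    if not PipelineState.is_terminal(state):
        result.cancel()
        raise AssertionError('Job did not reach a terminal state in time')
]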

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 18s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/wtv7wtfupepxa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org