Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2023/01/29 20:01:41 UTC

Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #888

See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/888/display/redirect>

Changes:


------------------------------------------
[...truncated 26.67 KB...]
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.59
  Using cached botocore-1.29.59-py3-none-any.whl (10.4 MB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.0-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.12.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2985565 sha256=91f6c220c18951bdc5c6b48c9c94d465d4766c25bab8f88122e5ed4a65b9d92f
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.2 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.59 botocore-1.29.59 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.0 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.4.2 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.0 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.0 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.65.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.20.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.5 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.0 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
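
The pre-building workflow mentioned above is enabled through pipeline options. A minimal sketch, assuming the Cloud Build engine and a hypothetical registry URL (neither value is taken from this job):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Build the SDK worker container once, up front, so each worker does not
    # reinstall the extra dependencies at startup.
    options = PipelineOptions(
        prebuild_sdk_container_engine="cloud_build",  # or "local_docker"
        docker_registry_push_url="gcr.io/my-project/prebuilt",  # hypothetical registry
    )
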
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0129150213.1675004868.259467/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0129150213.1675004868.259467/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0129150213.1675004868.259467/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0129150213.1675004868.259467/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230129150748260425-9426'
 createTime: '2023-01-29T15:07:49.409068Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-01-29_07_07_48-9178972955325958636'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0129150213'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-01-29T15:07:49.409068Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
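
For orientation, a minimal sketch of submitting a streaming pipeline with the same project, region, and temp location as the Job above; the transform chain is a placeholder, not the load test itself:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Option values mirror the Job fields above; the pipeline body is illustrative.
    options = PipelineOptions(
        runner="DataflowRunner",
        project="apache-beam-testing",
        region="us-central1",
        temp_location="gs://temp-storage-for-perf-tests/smoketests",
        streaming=True,
    )
    with beam.Pipeline(options=options) as pipeline:
        _ = pipeline | beam.Create(range(100)) | beam.CombineGlobally(sum)
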
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-01-29_07_07_48-9178972955325958636]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-01-29_07_07_48-9178972955325958636
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-01-29_07_07_48-9178972955325958636?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-01-29_07_07_48-9178972955325958636 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:57.684Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:58.954Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:58.998Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.072Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.208Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.250Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.347Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.417Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.472Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.542Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.574Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.607Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.646Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.685Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.723Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.753Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.786Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.817Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.851Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.886Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:07:59.920Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:00.045Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:00.090Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:00.123Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:00.161Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:00.197Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:01.325Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:01.360Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:01.396Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-01-29_07_07_48-9178972955325958636 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:07.459Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
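
The cleanup this message suggests can be scripted against the Monitoring API. A minimal sketch, assuming the google-cloud-monitoring client library (not part of this build's environment); the destructive call is deliberately commented out:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()

    # List only the Dataflow-created custom metric descriptors the message refers to.
    request = {
        "name": "projects/apache-beam-testing",
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)  # uncomment to delete
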
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:36.119Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:36.139Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:08:45.606Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:09:15.200Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:09:27.803Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:51:06.133Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:54:01.548Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T15:54:56.118Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T16:24:06.775Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T16:25:08.345Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T16:39:09.236Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T16:56:11.683Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T16:58:13.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T17:06:15.252Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T17:15:26.525Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T17:29:18.104Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T17:36:19.622Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T17:43:21.249Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T17:50:25.869Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T18:01:47.277Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T18:02:28.585Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T18:07:32.467Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T18:24:24.572Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T18:34:35.870Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T18:58:35.307Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T19:01:40.543Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T19:33:42.611Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T19:42:44.140Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-01-29_07_07_48-9178972955325958636 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T20:00:57.448Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-01-29_07_07_48-9178972955325958636.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T20:00:57.486Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T20:00:57.534Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T20:00:57.556Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T20:00:57.584Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-29T20:00:57.600Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-01-29_07_07_48-9178972955325958636?project=<ProjectId>
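
Here wait_until_finish itself raises once the duration elapses without the job reaching a terminal state. A minimal sketch of the surrounding pattern, using placeholder names (result, timeout_ms) that mirror the harness rather than quote its code, and cancelling the stuck job explicitly:

    from apache_beam.runners.runner import PipelineState

    # result is the PipelineResult returned by pipeline.run().
    try:
        state = result.wait_until_finish(duration=timeout_ms)
    except Exception:
        state = result.state  # the Dataflow runner may raise instead of returning
    if not PipelineState.is_terminal(state):
        # Tear the job down rather than leave it running after the timeout.
        result.cancel()
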

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 12s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dzlwptqvoqi6k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal: beam_LoadTests_Python_Combine_Dataflow_Streaming #908

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/908/display/redirect?page=changes>




Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #907

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/907/display/redirect?page=changes>

Changes:

[noreply] Fix whitespace check (#25514)

[noreply] Add back links removed in #24744 (#25513)

[bvolpato] Fix pull_licenses_java returning 404 from opensource.org

[noreply] [#24789][prism] internal/jobservices metrics + tests (#25497)

[noreply] [Go SDK]: Allow SDF methods to have context param and error return value

[noreply] Move closing milestone out of PMC-only tasks (#25516)

[noreply] update GCP cloud libraries BOM to 26.8.0 (#25470)

[noreply] Fix interface{} in iter& emit type of DoFn in Go (#25203)

[noreply] Task #25064: Python SDK Data sampling implementation (#25093)

[noreply] Add support for all Java based portable runners to consume elements

[noreply] Swap Java SDK container to use eclipse-temurin as the base instead of

[noreply] Update BigQueryIO documentation with details on how to override the

[noreply] Support Avro GenericRecord as a valid format for StorageWrite API on

[elizaveta.lomteva] complete examples links fixed


------------------------------------------
[...truncated 26.65 KB...]
  Using cached azure_core-1.26.3-py3-none-any.whl (174 kB)
Collecting azure-identity>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.46.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting botocore<1.30.0,>=1.29.73
  Using cached botocore-1.29.73-py3-none-any.whl (10.4 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting pytest-forked
  Using cached pytest_forked-1.6.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.13.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=3029875 sha256=984f0439cf010618fb47052fd8910f796ee57d674865dc5d2bc937a106800c48
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.73 botocore-1.29.73 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.1 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.68.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.6 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.6.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0217153132.1676650218.082096/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0217153132.1676650218.082096/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0217153132.1676650218.082096/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0217153132.1676650218.082096/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230217161018083043-6484'
 createTime: '2023-02-17T16:10:19.174953Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-17_08_10_18-4318114438587344678'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0217153132'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-17T16:10:19.174953Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-17_08_10_18-4318114438587344678]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-17_08_10_18-4318114438587344678
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-17_08_10_18-4318114438587344678?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-17_08_10_18-4318114438587344678 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:52.583Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.343Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.378Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.439Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.490Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.529Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.593Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.658Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.689Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.723Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.754Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.786Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.819Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.851Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.884Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.915Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.948Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:54.981Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:55.014Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:55.047Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:55.075Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:55.184Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:55.219Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:55.240Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:55.274Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:55.301Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:56.385Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:56.418Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:10:56.457Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-17_08_10_18-4318114438587344678 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:11:17.793Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:11:36.786Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:11:36.825Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:11:46.580Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:12:15.213Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:12:25.528Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:53:54.020Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:55:52.615Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T16:57:54.072Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T17:26:05.408Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T17:26:56.256Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T17:38:57.322Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T18:00:58.692Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T18:07:00.360Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T18:22:06.394Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T18:22:08.612Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T18:34:26.693Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T18:35:27.608Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T19:10:29.016Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T19:12:30.140Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T19:42:41.382Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T19:49:32.888Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T20:15:23.545Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-17_08_10_18-4318114438587344678.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-17_08_10_18-4318114438587344678 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T20:15:23.595Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T20:15:23.677Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T20:15:23.699Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T20:15:23.787Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-17T20:15:23.824Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-17_08_10_18-4318114438587344678?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 8m 6s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/sgen6k2pmwen4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #906

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/906/display/redirect?page=changes>

Changes:

[noreply] SpannerIO: parameterizing partitionQuery timeout (#25236)

[noreply] Use example id as CloudPath (#25487)

[noreply] SpannerIO: Handling pg spanner.commit_timestamp (#25479)

[noreply] Fix provider to be found by AutoService (#25491)

[noreply] [Python] Added Tensorflow Model Handler  (#25368)

[noreply] [#24789][prism] internal/coders.go and tests (#25476)

[noreply] Update documents for 2.45 (#25407)

[noreply] Update documents for 2.45 (#25500)

[noreply] Update python container images (#25475)

[noreply] Fix beam.Row.__eq__ for rows with trailing columns (#23876)

[noreply] TFMA notebook showing ExtractEvaluateAndWriteResult Transform (#25381)


------------------------------------------
[...truncated 26.50 KB...]
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from azure-core>=1.7.0->apache-beam==2.46.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.72
  Using cached botocore-1.29.72-py3-none-any.whl (10.4 MB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Requirement already satisfied: pluggy<2.0,>=0.12 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.6.0-py3-none-any.whl (4.9 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.13.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=3022742 sha256=9209de4a29a96f7dffb713a3391ae156f0051741bcaafcf90f32f8252c15d7d1
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.72 botocore-1.29.72 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.1 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.1 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.68.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.6 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.6.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
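
The pre-building hint above corresponds to SDK setup options. A hedged sketch of opting in, assuming a Dataflow job that can push to a container registry; the registry URL is a placeholder, not this job's configuration:

    # Pre-build the SDK worker container at submission time so extra
    # dependencies are not pip-installed on every worker at startup.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/my-project/prebuilt',  # placeholder
    ])
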
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0216150318.1676560125.420870/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0216150318.1676560125.420870/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0216150318.1676560125.420870/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0216150318.1676560125.420870/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230216150845422486-5564'
 createTime: '2023-02-16T15:08:46.692944Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-16_07_08_46-3767038071377425053'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0216150318'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-16T15:08:46.692944Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-16_07_08_46-3767038071377425053]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-16_07_08_46-3767038071377425053
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-16_07_08_46-3767038071377425053?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-16_07_08_46-3767038071377425053 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:08:59.705Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.080Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.111Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.174Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.240Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.270Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.338Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.395Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.428Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.463Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.494Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.528Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.551Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.588Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.620Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.652Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.679Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.701Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.723Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.745Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.770Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.867Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.900Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.936Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:01.979Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:02.015Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
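
The fusion messages above trace the shape of the Combine load-test pipeline: a synthetic source, a timing step, a global Top combine (which expands to the KeyWithVoid/CombinePerKey/UnKey steps shown), then a consume step and a closing timing step. A rough sketch of that shape, with beam.Create standing in for the synthetic SDF source and identity Maps for the "Measure time" steps (illustrative, not the actual test code):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | 'Read synthetic' >> beam.Create(range(1000))
            | 'Measure time: Start' >> beam.Map(lambda x: x)
            | 'Combine with Top 0' >> beam.CombineGlobally(
                beam.combiners.TopCombineFn(10)).without_defaults()
            | 'Consume 0' >> beam.Map(len)
            | 'Measure time: End 0' >> beam.Map(lambda x: x)
        )
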
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-16_07_08_46-3767038071377425053 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:03.080Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:03.107Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:03.157Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:26.594Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:09:41.626Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:10:14.094Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:10:25.423Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:52:08.582Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:53:59.430Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T15:55:06.538Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T16:23:58.950Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T16:27:04.207Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T16:38:02.189Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T16:57:03.253Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T16:58:08.341Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T17:14:05.857Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T17:15:08.250Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T17:58:07.749Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T18:01:08.574Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T18:29:10.378Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T18:30:12.229Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T19:03:13.732Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T19:04:13.438Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T19:07:14.613Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T19:29:17.043Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T19:35:19.758Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T19:54:21.143Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-16_07_08_46-3767038071377425053 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T20:00:53.858Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-16_07_08_46-3767038071377425053.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T20:00:53.902Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T20:00:53.972Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T20:00:53.997Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T20:00:54.015Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-16T20:00:54.037Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-16_07_08_46-3767038071377425053?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 49s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/4w4iw3dycxik6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #905

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/905/display/redirect?page=changes>

Changes:

[noreply] Update grafana dashboard to handle name change for

[noreply] Test loading URLs (#25034)

[noreply] Fixing issue with ErrorCapture transform where pipeline issues are

[noreply] [24464] Finalize FileWriteSchemaTransformProvider (#25420)

[noreply] Use codecov-action@v3; v2 is no longer supported. (#25477)

[noreply] Update the title of the wordcount quickstart (#25471)

[noreply] [Go SDK] add retries to connect with expansion service (#25237)


------------------------------------------
[...truncated 26.95 KB...]
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.71
  Using cached botocore-1.29.71-py3-none-any.whl (10.4 MB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.6.0-py3-none-any.whl (4.9 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.13.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=3013053 sha256=5f96aacfca546006c11da40948c19d9ef40803ed1135b5c65f7501054aeff56f
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.71 botocore-1.29.71 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.1 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.1 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.68.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.6 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.6.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0215150208.1676473677.393096/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0215150208.1676473677.393096/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0215150208.1676473677.393096/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0215150208.1676473677.393096/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230215150757394082-3695'
 createTime: '2023-02-15T15:07:58.417398Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-15_07_07_57-10014613229984620104'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0215150208'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-15T15:07:58.417398Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-15_07_07_57-10014613229984620104]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-15_07_07_57-10014613229984620104
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-15_07_07_57-10014613229984620104?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-15_07_07_57-10014613229984620104 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:05.205Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.019Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.058Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.129Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.198Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.227Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.277Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.335Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.385Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.409Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.452Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.475Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.511Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.535Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.564Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.593Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.622Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.649Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.674Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.699Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.734Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.833Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.879Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.905Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.931Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:07.964Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:08.079Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:08.114Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:08.164Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-15_07_07_57-10014613229984620104 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:41.935Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:08:47.052Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:09:18.967Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:09:28.995Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:50:04.766Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:52:05.992Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T15:54:06.821Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T16:22:04.843Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T16:23:10.065Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T16:37:07.642Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T16:46:08.928Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T16:56:09.980Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T17:04:11.623Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T17:12:22.643Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T17:22:13.734Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T17:29:14.760Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T17:45:16.404Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T17:48:18.023Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T18:05:29.126Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T18:06:21.104Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T18:22:22.236Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T18:27:23.436Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T18:38:24.259Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T18:48:25.224Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T18:56:27.517Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T19:06:29.093Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T19:12:40.421Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T19:24:31.629Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T19:32:33.677Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T19:42:34.936Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T19:50:46.110Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T20:00:37.074Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-15_07_07_57-10014613229984620104 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T20:00:56.902Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-15_07_07_57-10014613229984620104.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T20:00:56.936Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T20:00:57.012Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T20:00:57.039Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T20:00:57.060Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-15T20:00:57.082Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-15_07_07_57-10014613229984620104?project=<ProjectId>
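
The timeout path in that traceback follows the usual PipelineResult pattern: wait_until_finish(duration=...) takes a timeout in milliseconds and returns the job's last-seen state once it elapses, and because a streaming job never drains on its own, the caller has to cancel it and assert on the state. A minimal sketch of that pattern, assuming a trivial pipeline and a placeholder timeout (this is an illustration, not the load-test harness itself):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    # Placeholder options; the real test also sets project, region, temp_location, etc.
    options = PipelineOptions(['--runner=DataflowRunner', '--streaming'])

    pipeline = beam.Pipeline(options=options)
    _ = pipeline | beam.Impulse()  # trivial payload; the real test reads a synthetic source

    result = pipeline.run()
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)  # ms; placeholder timeout
    if state not in (PipelineState.DONE, PipelineState.FAILED, PipelineState.CANCELLED):
        # Not terminal (other terminal states omitted for brevity): cancel the job.
        # This cancel is what produces the JOB_STATE_CANCELLING messages seen above.
        result.cancel()
        raise AssertionError('Job did not reach a terminal state: %s' % state)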

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
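
Rerunning the failing task with the flags suggested above would look like this from the workspace root (a hypothetical local invocation; the task path and flags are the ones named in this log):

    ./gradlew :sdks:python:apache_beam:testing:load_tests:run --stacktrace --info --warning-mode all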

BUILD FAILED in 4h 55m 26s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/slg7ggo364lke

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #904

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/904/display/redirect?page=changes>

Changes:

[xqhu] Ignore flags for beam_sql magic

[xqhu] Raise the error when some functions are misused

[xqhu] change how to import

[noreply] Add WatchFilePattern  (#25393)

[noreply] Support gauge metrics in portable mode (#25396)

[noreply] Validate that GBK coders are always set correctly. (#25394)

[noreply] Fix tox error running hdfsIntegrationTest (#25446)

[noreply] Adding support for DLQ for ZetaSQL (#25426)

[noreply] Remove python 3.6 references (#25445)

[noreply] Fix pulling licenses (#25456)


------------------------------------------
[...truncated 27.47 KB...]
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.6.0-py3-none-any.whl (4.9 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.13.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=3013053 sha256=7f97bad70360ad4f10b9deccdc893654556599954efc72d481c3eaac24fa6af0
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.70 botocore-1.29.70 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.1 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.1 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.68.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.6 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.6.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in the SDK worker container; consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more at https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
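
The pre-building workflow that hint refers to is opt-in through pipeline options. A minimal sketch, assuming Beam's SetupOptions flag names and a hypothetical registry URL:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Pre-build the SDK worker container once (via Cloud Build here) instead of
    # installing the extra dependencies on every worker start-up.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--prebuild_sdk_container_engine=cloud_build',       # assumed flag; 'local_docker' is the other engine
        '--docker_registry_push_url=gcr.io/<project>/beam',  # hypothetical registry path
    ])
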
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0214150235.1676387292.587932/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0214150235.1676387292.587932/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0214150235.1676387292.587932/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0214150235.1676387292.587932/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230214150812588980-6820'
 createTime: '2023-02-14T15:08:13.693346Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-14_07_08_13-17476471195205100016'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0214150235'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-14T15:08:13.693346Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-14_07_08_13-17476471195205100016]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-14_07_08_13-17476471195205100016
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-14_07_08_13-17476471195205100016?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-14_07_08_13-17476471195205100016 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:19.010Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.316Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.373Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.446Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.541Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.584Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.648Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.716Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.770Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.810Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.853Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.885Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.922Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.954Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:20.990Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:21.034Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:21.073Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:21.106Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:21.140Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:21.188Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:21.225Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
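
Taken together, the fused step names above describe a pipeline of roughly this shape. This is a reconstruction from the step names, not the test's actual code; the Top size and element count are placeholders:

    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(1000))      # stand-in for the SDF-expanded synthetic read
         | 'Measure time: Start' >> beam.Map(lambda x: x)    # timing pass-through in the real test
         | 'Combine with Top 0' >> combiners.Top.Of(10)      # expands into KeyWithVoid/CombinePerKey/UnKey
         | 'Consume 0' >> beam.Map(lambda x: x)
         | 'Measure time: End 0' >> beam.Map(lambda x: x))
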
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:21.335Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:21.380Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:21.426Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:21.463Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:21.505Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:22.600Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:22.641Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:22.670Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-14_07_08_13-17476471195205100016 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:08:32.917Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:09:01.572Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:09:33.245Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:09:45.260Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T15:50:49.877Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
[... 32 near-identical autoscaling messages (2023-02-14T15:51:51Z through 2023-02-14T20:10:23Z) omitted ...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T20:11:24.505Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-14_07_08_13-17476471195205100016 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T20:18:33.322Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-14_07_08_13-17476471195205100016.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T20:18:33.350Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T20:18:33.385Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T20:18:33.406Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T20:18:33.437Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-14T20:18:33.458Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-14_07_08_13-17476471195205100016?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5h 12m 31s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/bfnawm3z765u4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #903

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/903/display/redirect?page=changes>

Changes:

[noreply] (#25316) Enable LZMA compression in Python SDK I/O (#25317)


------------------------------------------
[...truncated 27.08 KB...]
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.69
  Using cached botocore-1.29.69-py3-none-any.whl (10.4 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.6.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.13.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=3005849 sha256=ddb69815bb9b3aaaeafe7ed4c36708a449e5806425c42f9328450a542dd27b85
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.69 botocore-1.29.69 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.1 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.1 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.68.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.6 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.6.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in the SDK worker container; consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more at https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0213150211.1676301385.814939/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0213150211.1676301385.814939/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0213150211.1676301385.814939/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0213150211.1676301385.814939/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230213151625815927-6427'
 createTime: '2023-02-13T15:16:26.929847Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-13_07_16_26-7822987591622234408'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0213150211'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-13T15:16:26.929847Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-13_07_16_26-7822987591622234408]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-13_07_16_26-7822987591622234408
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-13_07_16_26-7822987591622234408?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-13_07_16_26-7822987591622234408 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:35.501Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:41.931Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:42.879Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:42.948Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.025Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.056Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.123Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.189Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.236Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.257Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.289Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.323Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.347Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.379Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.410Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.441Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.474Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.508Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.539Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.566Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.599Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.719Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.754Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.784Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.818Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.874Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:43.983Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:44.019Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:16:44.052Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-13_07_16_26-7822987591622234408 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:17:01.673Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:17:23.381Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:17:23.419Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:17:33.144Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:17:59.036Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T15:18:10.232Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T16:01:40.706Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T16:02:43.012Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T16:03:44.264Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T16:35:41.937Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T16:36:47.527Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T17:10:44.921Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T17:11:46.288Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T17:17:47.912Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T17:45:49.736Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T17:46:51.116Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T18:12:52.492Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T18:18:53.135Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T18:23:54.384Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T18:42:56.351Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T19:00:57.429Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T19:18:58.402Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T19:28:59.635Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T19:53:01.944Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T19:56:14.189Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T20:26:06.532Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T20:35:08.717Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
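
The repeated autoscaling messages above show the job holding at its 5-worker goal after the earlier quota-limited resize to 2. For a load test, a sketch of the standard Beam pipeline options that pin the streaming worker pool so it cannot oscillate; the project and bucket values below are placeholders:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project-id",             # placeholder
        region="us-central1",
        temp_location="gs://my-bucket/tmp",  # placeholder
        streaming=True,
        num_workers=5,       # initial worker count
        max_num_workers=5,   # equal to num_workers: effectively a fixed pool
    )
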
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-13_07_16_26-7822987591622234408 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T20:51:09.047Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-13_07_16_26-7822987591622234408.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T20:51:09.092Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T20:51:09.142Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T20:51:09.187Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T20:51:09.208Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-13T20:51:09.236Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-13_07_16_26-7822987591622234408?project=<ProjectId>
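
The assertion comes from wait_until_finish(duration=self.timeout_ms) returning without the job reaching a terminal state. A minimal sketch of the underlying pattern, bounding the wait and cancelling a stuck job rather than raising; it assumes an already-constructed pipeline p, and the timeout value is a placeholder:

    from apache_beam.runners.runner import PipelineState

    result = p.run()
    # duration is in milliseconds; returns None if the job has not finished.
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)
    if state is None or not PipelineState.is_terminal(state):
        # Tear the streaming job down instead of waiting forever.
        result.cancel()
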

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5h 37m 44s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/deidea3o25yp6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #902

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/902/display/redirect?page=changes>

Changes:

[noreply] Add support for loading torchscript models (#25321)

[noreply] [#24971] Add a retry policy for JmsIO #24971 (#24973)


------------------------------------------
[...truncated 26.39 KB...]
  Using cached azure_core-1.26.3-py3-none-any.whl (174 kB)
Collecting azure-identity>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.46.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.69
  Using cached botocore-1.29.69-py3-none-any.whl (10.4 MB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.13.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=3005716 sha256=73a8d640e8f4963c64a98f29ec66c0b279bb003499448d9d1230d5289c3e66f5
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.69 botocore-1.29.69 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.1 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.1 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.68.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.6 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0212150209.1676214473.144270/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0212150209.1676214473.144270/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0212150209.1676214473.144270/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0212150209.1676214473.144270/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230212150753145302-7750'
 createTime: '2023-02-12T15:07:54.233392Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-12_07_07_53-6784897962282323209'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0212150209'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-12T15:07:54.233392Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-12_07_07_53-6784897962282323209]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-12_07_07_53-6784897962282323209
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-12_07_07_53-6784897962282323209?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-12_07_07_53-6784897962282323209 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:07:58.596Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:07:59.831Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:07:59.870Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:07:59.925Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:07:59.992Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.028Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.085Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.156Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.199Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.226Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.247Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.278Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.306Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.336Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.367Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.395Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.426Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.449Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.472Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.507Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.538Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
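
The "Combine with Top 0/KeyWithVoid", "CombinePerKey", and "UnKey" steps in the fusion messages above are how CombineGlobally expands before Dataflow fuses the graph. A simplified sketch of a pipeline with the same shape; the synthetic source is replaced by Create, and the "Measure time" steps are omitted:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | "Read synthetic" >> beam.Create(range(1000))  # stand-in for the synthetic source
         | "Combine with Top 0" >> beam.CombineGlobally(beam.combiners.TopCombineFn(10))
         | "Consume 0" >> beam.Map(len))
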
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.638Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.676Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.704Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.738Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:00.771Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:01.843Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:01.880Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:01.909Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-12_07_07_53-6784897962282323209 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:12.805Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:08:47.416Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:09:17.225Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:09:28.900Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:52:59.398Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:54:13.259Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T15:55:12.717Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T16:27:13.449Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T16:30:20.687Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T16:58:15.055Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T17:17:17.459Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T17:34:13.277Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T17:53:14.460Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T18:11:25.241Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T18:30:28.434Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T18:47:30.929Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T19:04:32.179Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T19:07:26.341Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T19:23:35.528Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T19:25:31.009Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T19:39:41.820Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T19:54:34.254Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-12_07_07_53-6784897962282323209 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T20:01:08.940Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-12_07_07_53-6784897962282323209.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T20:01:08.969Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T20:01:09.071Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T20:01:09.093Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T20:01:09.114Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-12T20:01:09.138Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-12_07_07_53-6784897962282323209?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 13s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lv45d26v2lvnw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #901

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/901/display/redirect?page=changes>

Changes:

[chamikaramj] Update PythonMap transform to accept extra packages

[chamikaramj] Update the test

[chamikaramj] Address reviewer comments

[chamikaramj] Copy environment capabilities when creating the WorkerPool for Java

[noreply] Added MetadataSpannerConfig class for generating SpannerConfig for

[noreply] Remove ValueProvider from BigtableIO ReadChangeStream (#25409)

[noreply] Annotate Cloud Bigtable implementation details as Internal (#25403)

[noreply] Add dependencies in some examples (#25425)

[noreply] Add batching args to ModelHandlers docs (#25398)

[noreply] Data sampling proto (#25421)

[noreply] Support ONNX runtime in RunInference API  (#24911)

[noreply] Fix UpdateSchemaDestination breaking DynamicDestination in Bigquery

[noreply] Fix whitespace (#25432)

[noreply] [#25417][go] copy env details to dataflow images (#25431)

[noreply] Update jupyter-client requirement from <6.1.13,>=6.1.11 to

[noreply] Add runtime metric to TFT tests (#25242)

[noreply] Fix typo tranform; workaround non-ascii char (#25428)


------------------------------------------
[...truncated 27.20 KB...]
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting botocore<1.30.0,>=1.29.69
  Using cached botocore-1.29.69-py3-none-any.whl (10.4 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.13.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=3004665 sha256=0b69ff725c88b5fa8fcfcde06080d2235b7997e06511a287419e610682bfe98f
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.69 botocore-1.29.69 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.1 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.1 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.68.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.6 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0211150213.1676128078.476762/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0211150213.1676128078.476762/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0211150213.1676128078.476762/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0211150213.1676128078.476762/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230211150758477742-1732'
 createTime: '2023-02-11T15:07:59.595211Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-11_07_07_59-5722825513126140965'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0211150213'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-11T15:07:59.595211Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-11_07_07_59-5722825513126140965]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-11_07_07_59-5722825513126140965
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-11_07_07_59-5722825513126140965?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-11_07_07_59-5722825513126140965 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:05.910Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.253Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.286Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.365Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.496Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.514Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.559Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.603Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.634Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.661Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.679Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.720Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.755Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.783Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.806Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.878Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.907Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.935Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.958Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:07.980Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:08.004Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:08.084Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:08.113Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:08.134Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:08.163Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:08.195Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:09.293Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:09.312Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:09.367Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-11_07_07_59-5722825513126140965 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:17.975Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:08:51.237Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:09:23.762Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:09:34.618Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:51:30.703Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:53:46.160Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T15:54:38.560Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T16:23:40.595Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T16:24:41.724Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T16:34:46.978Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T16:54:48.941Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T16:56:57.361Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T17:07:41.647Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T17:10:51.991Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T17:19:47.804Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T17:33:48.852Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T17:34:51.253Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T17:52:52.209Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T17:57:52.809Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T18:09:04.010Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T18:24:06.526Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T18:28:10.052Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T18:34:11.844Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T18:45:06.571Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T18:56:08.323Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T19:07:20.270Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T19:18:12.533Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T19:32:24.496Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T19:39:25.223Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T19:54:17.594Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T19:55:29.898Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-11_07_07_59-5722825513126140965 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T20:01:03.270Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-11_07_07_59-5722825513126140965.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T20:01:03.308Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T20:01:03.370Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T20:01:03.401Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T20:01:03.443Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-11T20:01:03.463Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-11_07_07_59-5722825513126140965?project=<ProjectId>
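
Editor's note: the harness above passes a bounded duration to wait_until_finish and the job is then moved to JOB_STATE_CANCELLING. A minimal sketch (not the harness's exact code) of the same guard pattern is below; the timeout value and pipeline body are illustrative assumptions, and a runner that honours `duration` (e.g. DataflowRunner) is assumed:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    p = beam.Pipeline(options=PipelineOptions())  # DataflowRunner options go here
    _ = p | beam.Create([1, 2, 3]) | beam.combiners.Count.Globally()
    result = p.run()
    try:
        result.wait_until_finish(duration=4 * 60 * 60 * 1000)  # milliseconds
    finally:
        # If the job never reached a terminal state, cancel it so it does
        # not keep burning workers, mirroring the cancellation above.
        if not PipelineState.is_terminal(result.state):
            result.cancel()
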

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 6s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/x3itxzvv26toc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #900

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/900/display/redirect?page=changes>

Changes:

[noreply] fix doc on bq sxtorage (#25353)

[noreply] [Playground] Fix Test_getRunOrTestCmd on Go 1.20 (#25379)

[noreply] Fix typo - metdata -> metadata (#25399)

[noreply] [prism] Initial commit for READMEs and go.mod (#25404)

[noreply] [prism] Add urns package (#25405)

[noreply] Add Two Counter Metric in BigQuery Write Schema Transform (#25155)

[noreply] [prism] Add internal/config package (#25406)

[noreply] Fix typo in args (#25422)


------------------------------------------
[...truncated 26.67 KB...]
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.46.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.68
  Using cached botocore-1.29.68-py3-none-any.whl (10.4 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.13.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2994559 sha256=66033c5fcabd9c0dd5e8ea06b49359387a28369deef7f166b7778a1478fc1d13
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.68 botocore-1.29.68 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.1 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.1 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.68.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.6 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
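
Editor's note: a minimal sketch of opting into the pre-building workflow the message above recommends. The flag name and value follow the linked Dataflow docs; the project, bucket, and region values are hypothetical placeholders:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                        # hypothetical project
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',          # hypothetical bucket
        # Build the worker container (with extra deps baked in) once via
        # Cloud Build instead of reinstalling on every worker start.
        '--prebuild_sdk_container_engine=cloud_build',
    ])
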
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0210125408.1676042344.177408/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0210125408.1676042344.177408/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0210125408.1676042344.177408/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0210125408.1676042344.177408/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230210151904178402-2444'
 createTime: '2023-02-10T15:19:05.324937Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-10_07_19_04-10591076705871690114'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0210125408'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-10T15:19:05.324937Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-10_07_19_04-10591076705871690114]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-10_07_19_04-10591076705871690114
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-10_07_19_04-10591076705871690114?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-10_07_19_04-10591076705871690114 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:14.771Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:21.506Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:21.827Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:21.908Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:21.994Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.021Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.080Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.154Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.198Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.229Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.252Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.277Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.304Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.331Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.368Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.393Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.427Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.463Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.495Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.521Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.557Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
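
Editor's note: the fusion messages above show the shape of the load test's global Top combine: KeyWithVoid, then a CombinePerKey split into ConvertToAccumulators/Combine/Extract around the streaming GroupByKey WriteStream/ReadStream, then UnKey. A minimal sketch of a pipeline with that shape is below; the in-memory source and element count stand in for the synthetic source and are assumptions:

    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(100))
         # CombineGlobally with a Top combiner expands into the
         # KeyWithVoid -> CombinePerKey -> UnKey steps seen in the log.
         | 'Combine with Top 0' >> beam.CombineGlobally(
             combiners.TopCombineFn(10))  # top-N size is illustrative
         | 'Consume 0' >> beam.Map(print))
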
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.668Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.706Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.736Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.761Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:22.793Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:23.870Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:23.909Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:23.948Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-10_07_19_04-10591076705871690114 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:35.780Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:58.182Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:19:58.210Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:20:07.641Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:20:39.914Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T15:20:51.293Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T16:02:18.919Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T16:05:15.771Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T16:06:28.169Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T16:35:31.535Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T16:36:36.046Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T16:47:32.266Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T17:09:35.016Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T17:10:41.152Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T17:36:38.423Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T17:43:41.628Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T17:46:47.714Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T18:16:46.547Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T18:24:48.320Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T18:50:54.546Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T19:00:57.225Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T19:24:56.709Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T19:38:52.522Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T19:59:53.501Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T20:06:05.453Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-10_07_19_04-10591076705871690114 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T20:23:03.915Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-10_07_19_04-10591076705871690114.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T20:23:03.964Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T20:23:04.021Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T20:23:04.048Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T20:23:04.078Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-10T20:23:04.102Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-10_07_19_04-10591076705871690114?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5h 6m 16s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/us2yobpkpfy6g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #899

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/899/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Minor updates to typescript generation.

[noreply] Bump cryptography from 39.0.0 to 39.0.1 in /sdks/python/container/py39

[noreply] Playground Frontend Test workflow (#24728) (#25254)

[noreply] Simpify composite literal in metrics_test.go (#25384)

[noreply] Replace use of deprecated strings.Title function (#25385)

[noreply] Explicitly set mongo container version for testing (#25369)

[noreply] Update upper bound for numpy. (#24725)

[noreply] Support batching as config in RunInference (sklearn and pytorch)

[noreply] Add a note on increased lower bounds. (#25389)

[noreply] Support samza portable UDF metrics. (#25265)

[noreply] Adding support for @SchemaFieldDescription annotation that allows ann…

[noreply] Adding SpannerIO.readChangeStreams support for SchemaTransform (#24999)

[noreply] Bump google.golang.org/api from 0.108.0 to 0.109.0 in /sdks (#25250)

[noreply] Add min and max batch size args to model handler (#25395)

[noreply] Beam/sdks/io/gcp/java/healcare/hl7v2 io read (#25056)

[noreply] Remove reuse of GenericRecord instance when reading Avro from BigQuery

[noreply] fixed reading env variable (#25362)


------------------------------------------
[...truncated 2.75 KB...]
To honour the JVM settings for this build a single-use Daemon process will be forked. See https://docs.gradle.org/7.5.1/userguide/gradle_daemon.html#sec:disabling_the_daemon.
Daemon will be stopped at the end of the build 
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build

> Task :sdks:python:setupVirtualenv
Collecting pip
  Using cached pip-23.0-py3-none-any.whl (2.1 MB)
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 20.1.1
    Uninstalling pip-20.1.1:
      Successfully uninstalled pip-20.1.1
Successfully installed pip-23.0
Ignoring grpcio: markers 'sys_platform == "darwin"' don't match your environment
Ignoring protobuf: markers 'python_version == "3.10" and sys_platform == "darwin"' don't match your environment
Collecting tox==3.20.1
  Using cached tox-3.20.1-py2.py3-none-any.whl (83 kB)
Requirement already satisfied: setuptools in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1922375555/lib/python3.7/site-packages> (from -r <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build-requirements.txt> (line 20)) (47.1.0)
Collecting setuptools
  Using cached setuptools-67.2.0-py3-none-any.whl (1.1 MB)
Collecting six
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting wheel>=0.36.0
  Using cached wheel-0.38.4-py3-none-any.whl (36 kB)
Collecting grpcio-tools==1.37.0
  Using cached grpcio_tools-1.37.0-cp37-cp37m-manylinux2014_x86_64.whl (2.5 MB)
Collecting mypy-protobuf==1.18
  Using cached mypy_protobuf-1.18-py3-none-any.whl (7.3 kB)
Collecting distlib==0.3.1
  Using cached distlib-0.3.1-py2.py3-none-any.whl (335 kB)
Collecting numpy<1.25,>=1.14.3
  Using cached numpy-1.21.6-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (15.7 MB)
Collecting virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0
  Using cached virtualenv-20.19.0-py3-none-any.whl (8.7 MB)
Collecting filelock>=3.0.0
  Using cached filelock-3.9.0-py3-none-any.whl (9.7 kB)
Collecting toml>=0.9.4
  Using cached toml-0.10.2-py2.py3-none-any.whl (16 kB)
Collecting packaging>=14
  Using cached packaging-23.0-py3-none-any.whl (42 kB)
Collecting importlib-metadata<3,>=0.12
  Using cached importlib_metadata-2.1.3-py2.py3-none-any.whl (10 kB)
Collecting pluggy>=0.12.0
  Using cached pluggy-1.0.0-py2.py3-none-any.whl (13 kB)
Collecting py>=1.4.17
  Using cached py-1.11.0-py2.py3-none-any.whl (98 kB)
Collecting grpcio>=1.37.0
  Using cached grpcio-1.51.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.8 MB)
Collecting protobuf<4.0dev,>=3.5.0.post1
  Using cached protobuf-3.20.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.0 MB)
Collecting zipp>=0.5
  Using cached zipp-3.12.1-py3-none-any.whl (6.7 kB)
Collecting platformdirs<4,>=2.4
  Using cached platformdirs-3.0.0-py3-none-any.whl (14 kB)
Collecting virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0
  Using cached virtualenv-20.18.0-py3-none-any.whl (8.7 MB)
Collecting platformdirs<3,>=2.4
  Using cached platformdirs-2.6.2-py3-none-any.whl (14 kB)
Collecting virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0
  Using cached virtualenv-20.17.1-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.17.0-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.7-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.6-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.5-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.4-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.3-py2.py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.2-py2.py3-none-any.whl (8.8 MB)
Collecting typing-extensions>=4.4
  Using cached typing_extensions-4.4.0-py3-none-any.whl (26 kB)
Installing collected packages: distlib, zipp, wheel, typing-extensions, toml, six, setuptools, py, protobuf, packaging, numpy, grpcio, filelock, platformdirs, mypy-protobuf, importlib-metadata, grpcio-tools, virtualenv, pluggy, tox
  Attempting uninstall: setuptools
    Found existing installation: setuptools 47.1.0
    Uninstalling setuptools-47.1.0:
      Successfully uninstalled setuptools-47.1.0
Successfully installed distlib-0.3.1 filelock-3.9.0 grpcio-1.51.1 grpcio-tools-1.37.0 importlib-metadata-2.1.3 mypy-protobuf-1.18 numpy-1.21.6 packaging-23.0 platformdirs-2.6.2 pluggy-1.0.0 protobuf-3.20.3 py-1.11.0 setuptools-67.2.0 six-1.16.0 toml-0.10.2 tox-3.20.1 typing-extensions-4.4.0 virtualenv-20.16.2 wheel-0.38.4 zipp-3.12.1

> Task :sdks:python:apache_beam:testing:load_tests:setupVirtualenv
Collecting pip
  Using cached pip-23.0-py3-none-any.whl (2.1 MB)
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 20.1.1
    Uninstalling pip-20.1.1:
      Successfully uninstalled pip-20.1.1
Successfully installed pip-23.0
Ignoring grpcio: markers 'sys_platform == "darwin"' don't match your environment
Ignoring protobuf: markers 'python_version == "3.10" and sys_platform == "darwin"' don't match your environment
Collecting tox==3.20.1
  Using cached tox-3.20.1-py2.py3-none-any.whl (83 kB)
Requirement already satisfied: setuptools in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from -r <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build-requirements.txt> (line 20)) (47.1.0)
Collecting setuptools
  Using cached setuptools-67.2.0-py3-none-any.whl (1.1 MB)
Collecting six
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting wheel>=0.36.0
  Using cached wheel-0.38.4-py3-none-any.whl (36 kB)
Collecting grpcio-tools==1.37.0
  Using cached grpcio_tools-1.37.0-cp37-cp37m-manylinux2014_x86_64.whl (2.5 MB)
Collecting mypy-protobuf==1.18
  Using cached mypy_protobuf-1.18-py3-none-any.whl (7.3 kB)
Collecting distlib==0.3.1
  Using cached distlib-0.3.1-py2.py3-none-any.whl (335 kB)
Collecting numpy<1.25,>=1.14.3
  Using cached numpy-1.21.6-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (15.7 MB)
Collecting toml>=0.9.4
  Using cached toml-0.10.2-py2.py3-none-any.whl (16 kB)
Collecting virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0
  Using cached virtualenv-20.19.0-py3-none-any.whl (8.7 MB)
Collecting py>=1.4.17
  Using cached py-1.11.0-py2.py3-none-any.whl (98 kB)
Collecting importlib-metadata<3,>=0.12
  Using cached importlib_metadata-2.1.3-py2.py3-none-any.whl (10 kB)
Collecting packaging>=14
  Using cached packaging-23.0-py3-none-any.whl (42 kB)
Collecting pluggy>=0.12.0
  Using cached pluggy-1.0.0-py2.py3-none-any.whl (13 kB)
Collecting filelock>=3.0.0
  Using cached filelock-3.9.0-py3-none-any.whl (9.7 kB)
Collecting protobuf<4.0dev,>=3.5.0.post1
  Using cached protobuf-3.20.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.0 MB)
Collecting grpcio>=1.37.0
  Using cached grpcio-1.51.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.8 MB)
Collecting zipp>=0.5
  Using cached zipp-3.12.1-py3-none-any.whl (6.7 kB)
Collecting virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0
  Using cached virtualenv-20.18.0-py3-none-any.whl (8.7 MB)
  Using cached virtualenv-20.17.1-py3-none-any.whl (8.8 MB)
Collecting platformdirs<3,>=2.4
  Using cached platformdirs-2.6.2-py3-none-any.whl (14 kB)
Collecting virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0
  Using cached virtualenv-20.17.0-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.7-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.6-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.5-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.4-py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.3-py2.py3-none-any.whl (8.8 MB)
  Using cached virtualenv-20.16.2-py2.py3-none-any.whl (8.8 MB)
Collecting typing-extensions>=4.4
  Using cached typing_extensions-4.4.0-py3-none-any.whl (26 kB)
Installing collected packages: distlib, zipp, wheel, typing-extensions, toml, six, setuptools, py, protobuf, packaging, numpy, grpcio, filelock, platformdirs, mypy-protobuf, importlib-metadata, grpcio-tools, virtualenv, pluggy, tox
  Attempting uninstall: setuptools
    Found existing installation: setuptools 47.1.0
    Uninstalling setuptools-47.1.0:
      Successfully uninstalled setuptools-47.1.0
Successfully installed distlib-0.3.1 filelock-3.9.0 grpcio-1.51.1 grpcio-tools-1.37.0 importlib-metadata-2.1.3 mypy-protobuf-1.18 numpy-1.21.6 packaging-23.0 platformdirs-2.6.2 pluggy-1.0.0 protobuf-3.20.3 py-1.11.0 setuptools-67.2.0 six-1.16.0 toml-0.10.2 tox-3.20.1 typing-extensions-4.4.0 virtualenv-20.16.2 wheel-0.38.4 zipp-3.12.1

> Task :sdks:python:sdist
org/apache/beam/model/interactive/v1/beam_interactive_api.proto:36:1: warning: Import google/protobuf/timestamp.proto is unused.
Writing mypy to org/apache/beam/model/pipeline/v1/external_transforms_pb2.pyi
Writing mypy to org/apache/beam/model/pipeline/v1/beam_runner_api_pb2.pyi
Writing mypy to org/apache/beam/model/pipeline/v1/standard_window_fns_pb2.pyi
Writing mypy to org/apache/beam/model/pipeline/v1/metrics_pb2.pyi
Writing mypy to org/apache/beam/model/pipeline/v1/endpoints_pb2.pyi
Writing mypy to org/apache/beam/model/pipeline/v1/schema_pb2.pyi
Writing mypy to org/apache/beam/model/job_management/v1/beam_artifact_api_pb2.pyi
Writing mypy to org/apache/beam/model/job_management/v1/beam_expansion_api_pb2.pyi
Writing mypy to org/apache/beam/model/job_management/v1/beam_job_api_pb2.pyi
Writing mypy to org/apache/beam/model/fn_execution/v1/beam_provision_api_pb2.pyi
Writing mypy to org/apache/beam/model/fn_execution/v1/beam_fn_api_pb2.pyi
Writing mypy to org/apache/beam/model/interactive/v1/beam_interactive_api_pb2.pyi
<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1922375555/lib/python3.7/site-packages/setuptools/dist.py>:534: UserWarning: Normalizing '2.46.0.dev' to '2.46.0.dev0'
  warnings.warn(tmpl.format(**locals()))
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: no files found matching 'LICENSE.python'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md


> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
Processing <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
Requirement already satisfied: protobuf<4,>3.12.2 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from apache-beam==2.46.0.dev0) (3.20.3)
Collecting crcmod<2.0,>=1.7
  Using cached crcmod-1.7-cp37-cp37m-linux_x86_64.whl
Collecting orjson<4.0
  Using cached orjson-3.8.6.tar.gz (655 kB)
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'error'
  error: subprocess-exited-with-error
  
  × Preparing metadata (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [6 lines of output]
      
      Cargo, the Rust package manager, is not installed or is not on PATH.
      This package requires Rust and Cargo to compile extensions. Install it through
      the system's package manager or via https://rustup.rs/
      
      Checking for Rust toolchain....
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
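
Editor's note: the orjson 3.8.6 sdist needs a Rust toolchain because no cp37 wheel was published, while build #898 later in this digest installs orjson 3.8.5 from a wheel without issue. A hypothetical constraint pin that sidesteps the source build:

    # constraints.txt (hypothetical): stay on the last orjson release that
    # shipped a cp37 manylinux wheel, avoiding the Rust/Cargo build path.
    orjson==3.8.5
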

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 37s
14 actionable tasks: 8 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qtu6agqhr2fo4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #898

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/898/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Better batching for higher fixed costs.

[Robert Bradshaw] Add an option to get the old behavior.

[Robert Bradshaw] lint

[noreply] [BEAM-12164] Enforced only positive state transitions from CREATED ->

[noreply] Added Role-based access control integration tests for Spanner Change

[noreply] Fail the pipeline when a mismatched Python or Beam version is detected.

[noreply] [Spark runner] Removal of Spark 2 runner support (closes #25259)

[noreply] Add TensorRT runinference example for Text Classification (#25226)


------------------------------------------
[...truncated 26.62 KB...]
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting botocore<1.30.0,>=1.29.66
  Using cached botocore-1.29.66-py3-none-any.whl (10.4 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.12.1)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2993882 sha256=b1019ce2fe3136efeb58411238b83d45e6a5f20aa763867e3584a72132655469
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.66 botocore-1.29.66 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.1 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.0 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.1 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.67.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.5 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
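
The pre-building workflow referenced above bakes the pipeline's extra dependencies into the SDK container image once, instead of reinstalling them each time a worker starts. A minimal sketch of enabling it from pipeline options, assuming the Beam Python SDK's --prebuild_sdk_container_engine and --docker_registry_push_url setup options (the registry URL below is a placeholder to verify against your own project):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--streaming',
        # Build the dependencies into the SDK container image once up front:
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/apache-beam-testing/prebuilt',  # placeholder
    ])

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([0]) | beam.Map(print)
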
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0208150205.1675868876.383180/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0208150205.1675868876.383180/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0208150205.1675868876.383180/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0208150205.1675868876.383180/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230208150756384186-2252'
 createTime: '2023-02-08T15:07:57.752400Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-08_07_07_57-11897182110180530087'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0208150205'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-08T15:07:57.752400Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-08_07_07_57-11897182110180530087]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-08_07_07_57-11897182110180530087
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-08_07_07_57-11897182110180530087?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-08_07_07_57-11897182110180530087 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:22.140Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:28.461Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.456Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.532Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.603Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.631Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.673Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.726Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.761Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.817Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.850Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.881Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.915Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.952Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:29.980Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:30.014Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:30.047Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:30.073Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:30.105Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:30.129Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:30.152Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
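
The fused step names above come from the Combine load test's pipeline. A rough reconstruction of its shape follows; this is a hedged sketch, not combine_test.py itself — beam.Create stands in for the synthetic source and identity Maps stand in for the MeasureTime DoFns:

    import apache_beam as beam
    from apache_beam.transforms import combiners

    def build(p):
        return (
            p
            | 'Read synthetic' >> beam.Create(range(1000))    # stand-in for the synthetic source
            | 'Measure time: Start' >> beam.Map(lambda x: x)  # stand-in for the MeasureTime DoFn
            # CombineGlobally expands to the KeyWithVoid/CombinePerKey/UnKey
            # steps visible in the fusion messages above:
            | 'Combine with Top 0' >> beam.CombineGlobally(
                combiners.TopCombineFn(10)).without_defaults()
            | 'Consume 0' >> beam.Map(len)                    # consume the top-N list
            | 'Measure time: End 0' >> beam.Map(lambda x: x)
        )

    with beam.Pipeline() as p:
        build(p)
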
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:30.242Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:30.271Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:30.301Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:30.336Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:30.369Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:31.431Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:31.458Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:31.491Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-08_07_07_57-11897182110180530087 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:08:51.108Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
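
If the 100-descriptor cap matters for your own metrics, stale custom descriptors can be listed (and, per the advice above, deleted) with the Cloud Monitoring client. A minimal sketch assuming the google-cloud-monitoring package; the delete call is deliberately left commented out:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)  # uncomment to delete
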
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:09:16.046Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:09:16.081Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:09:24.577Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:09:42.778Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:09:50.031Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:50:31.921Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:53:23.402Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T15:54:25.952Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T16:14:38.310Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T16:23:28.276Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T16:37:39.300Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T16:46:34.404Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T16:49:45.485Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T17:00:47.759Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T17:11:47.222Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T17:12:42.355Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T17:27:52.968Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T17:34:54.724Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T17:52:55.903Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T18:01:00.345Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T18:28:54.969Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T18:51:04.820Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T19:04:57.170Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T19:26:08.178Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T19:40:09.766Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-08_07_07_57-11897182110180530087 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T20:00:52.900Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-08_07_07_57-11897182110180530087.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T20:00:52.940Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T20:00:52.992Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T20:00:53.021Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T20:00:53.042Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-08T20:00:53.098Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-08_07_07_57-11897182110180530087?project=<ProjectId>
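
The traceback shows the failure mode shared by these builds: load_test.py calls wait_until_finish(duration=self.timeout_ms), the call returns with the streaming job still running, the job is then cancelled (the JOB_STATE_CANCELLING transition just after 20:00 above), and the assertion fires because the returned state is not terminal. A minimal sketch of that wait-then-cancel pattern, not necessarily the harness's exact sequence:

    from apache_beam.runners.runner import PipelineState

    def wait_or_cancel(result, timeout_ms):
        # `result` is assumed to be the PipelineResult from pipeline.run()
        # on Dataflow; `duration` is expressed in milliseconds.
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            result.cancel()  # streaming jobs run until cancelled
            raise AssertionError('Job did not reach a terminal state: %s' % state)
        return state
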

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 16s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2c57paccv4eqy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #897

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/897/display/redirect?page=changes>

Changes:

[Andrew Pilloud] Don't discard output column names

[noreply] delete file used in internal testing (#25339)

[noreply] Fix output timestamp for multi output receiver in FnApiDoFnRunner #25344

[noreply] Change UnboundedScheduledExecutorService to avoid creating threads when

[noreply] Upgrading spring-expression to latest patch version (#25348)


------------------------------------------
[...truncated 26.33 KB...]
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.46.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.65
  Using cached botocore-1.29.65-py3-none-any.whl (10.4 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.12.1)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2991127 sha256=58bcec2a763c9ce95b65705e4f1f6052e5b80e573e135160f55d234b1d06a45c
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.65 botocore-1.29.65 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.0 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.0 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.1 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.67.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.5 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0207150209.1675782488.253217/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0207150209.1675782488.253217/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0207150209.1675782488.253217/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0207150209.1675782488.253217/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230207150808254465-3566'
 createTime: '2023-02-07T15:08:09.438702Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-07_07_08_08-7270676574769881536'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0207150209'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-07T15:08:09.438702Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-07_07_08_08-7270676574769881536]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-07_07_08_08-7270676574769881536
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-07_07_08_08-7270676574769881536?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-07_07_08_08-7270676574769881536 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:19.476Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:24.866Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:24.903Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:24.957Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.029Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.056Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.112Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.166Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.195Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.220Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.240Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.264Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.286Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.324Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.356Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.389Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.411Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.435Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.457Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.477Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.499Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.568Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.602Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.627Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.656Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:25.688Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:26.745Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:26.777Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:26.807Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-07_07_08_08-7270676574769881536 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:08:33.041Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:09:10.618Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:09:43.766Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:09:53.876Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:50:22.263Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:51:19.509Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:54:29.400Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T15:55:25.591Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T16:21:34.393Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T16:34:39.121Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T16:35:50.762Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T16:52:39.908Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T16:59:43.818Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T17:14:45.667Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T17:26:47.471Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T17:40:51.436Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T17:52:50.420Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T18:05:55.863Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T18:27:06.653Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T18:41:59.838Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T19:01:14.101Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T19:10:05.695Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T19:34:03.808Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T19:45:00.389Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T19:59:08.209Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-07_07_08_08-7270676574769881536 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T20:00:48.098Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-07_07_08_08-7270676574769881536.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T20:00:48.134Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T20:00:48.186Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T20:00:48.213Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T20:00:48.241Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-07T20:00:48.264Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-07_07_08_08-7270676574769881536?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 47s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fdexpdha57zvs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #896

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/896/display/redirect?page=changes>

Changes:

[noreply] Clarify llm download/loading instructions (#25145)

[noreply] Allow for setMaxRetryJobs in BigQueryIO to be configurable (#25224)


------------------------------------------
[...truncated 26.89 KB...]
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.64
  Using cached botocore-1.29.64-py3-none-any.whl (10.4 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.12.1)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2991127 sha256=d276788f9b61572d891203dae58af9b7c0e11214d3ad8bdcddff9aec49d7a40e
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.64 botocore-1.29.64 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.0 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.0 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.1 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.67.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.5 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
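A minimal sketch, assuming the standard Beam setup options, of what opting into the pre-building workflow suggested above looks like; the registry URL is a hypothetical placeholder, not taken from this job's configuration:

    # Sketch: pre-build the SDK worker container once via Cloud Build instead
    # of installing the extra dependencies on every worker at startup.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        '--prebuild_sdk_container_engine=cloud_build',
        # Hypothetical registry; replace with a real Artifact Registry path.
        '--docker_registry_push_url=us-central1-docker.pkg.dev/my-project/my-repo',
    ])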
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0206150209.1675696091.280459/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0206150209.1675696091.280459/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0206150209.1675696091.280459/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0206150209.1675696091.280459/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230206150811281437-1470'
 createTime: '2023-02-06T15:08:12.445589Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-06_07_08_11-1814280798881489135'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0206150209'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-06T15:08:12.445589Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-06_07_08_11-1814280798881489135]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-06_07_08_11-1814280798881489135
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-06_07_08_11-1814280798881489135?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-06_07_08_11-1814280798881489135 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:18.956Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.211Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.240Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.316Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.380Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.409Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.476Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.539Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.577Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.606Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.634Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.660Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.693Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.720Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.755Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.776Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.802Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.831Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.857Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.890Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:20.916Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
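The "Combine with Top 0" steps fused above are the expansion of a global combine (KeyWithVoid / CombinePerKey / UnKey). A minimal sketch of the pipeline shape under test, with the synthetic source and timing steps stubbed out; the element count and top-N size are illustrative:

    import apache_beam as beam
    from apache_beam.transforms import combiners

    with beam.Pipeline() as p:
        (p
         | 'Read synthetic' >> beam.Create(range(1000))  # stand-in for the synthetic source
         | 'Measure time: Start' >> beam.Map(lambda x: x)  # stand-in for the timing DoFn
         # CombineGlobally expands into the KeyWithVoid/CombinePerKey/UnKey steps above:
         | 'Combine with Top 0' >> beam.CombineGlobally(combiners.TopCombineFn(10))
         | 'Consume 0' >> beam.Map(len))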
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:21.011Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:21.039Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:21.077Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:21.112Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:21.146Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:22.216Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:22.262Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:22.285Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-06_07_08_11-1814280798881489135 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:08:28.845Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
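A minimal sketch, assuming the google-cloud-monitoring client library, of how the old custom metric descriptors mentioned above could be listed and pruned; the project name mirrors the log and the filter is illustrative:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        # Only the Dataflow-created custom metrics named in the message above:
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # Uncomment to actually delete an unused descriptor:
        # client.delete_metric_descriptor(name=descriptor.name)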
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:09:06.055Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:09:35.187Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:09:47.288Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:50:23.168Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:52:21.753Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T15:53:23.323Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T16:21:24.076Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T16:22:25.982Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T16:38:28.828Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T16:47:30.533Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T16:54:35.312Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T17:02:37.125Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T17:12:29.205Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T17:23:30.894Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T17:29:41.985Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T17:37:34.036Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T17:47:45.293Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T17:55:46.958Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T17:57:49.543Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T18:11:50.975Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T18:22:55.778Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T18:28:57.089Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T18:36:58.539Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T18:47:53.986Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T18:57:56.506Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T19:05:06.641Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T19:12:08.077Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T19:23:09.718Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T19:32:11.812Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T19:40:17.575Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T19:46:16.295Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T19:58:19.476Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-06_07_08_11-1814280798881489135 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T20:00:53.613Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-06_07_08_11-1814280798881489135.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T20:00:53.636Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T20:00:53.690Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T20:00:53.708Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T20:00:53.740Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-06T20:00:53.759Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py">, line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py">, line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py">, line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-06_07_08_11-1814280798881489135?project=<ProjectId>
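The assertion above comes from a wait-and-cancel pattern: the load test waits on the streaming job with a timeout and, when the job never reaches a terminal state, cancels it and fails. A minimal sketch of that pattern (not the actual load_test.py code; the helper name is illustrative):

    import logging
    from apache_beam.runners.runner import PipelineState

    def wait_or_cancel(result, timeout_ms):
        # Wait up to timeout_ms; a streaming job may still be RUNNING afterwards.
        state = result.wait_until_finish(duration=timeout_ms)
        if not PipelineState.is_terminal(state):
            logging.warning('Job still in state %s after %d ms; cancelling.',
                            state, timeout_ms)
            result.cancel()
            raise AssertionError('Job did not reach a terminal state.')
        return state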

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 20s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/oiqdotfzzzzwy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #895

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/895/display/redirect?page=changes>

Changes:

[noreply] Basic SchemaTransform implementation for SQLTransform. (#25177)

[noreply] issue24170 google colab link added (#24820)


------------------------------------------
[...truncated 26.65 KB...]
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.64
  Using cached botocore-1.29.64-py3-none-any.whl (10.4 MB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.12.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2991127 sha256=fff15161ade789b1efc3f4f814b139a864b596897a0af5d3d56e883d5af70aad
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.64 botocore-1.29.64 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.0 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.0 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.0 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.67.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.5 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.1 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0205150214.1675609670.361550/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0205150214.1675609670.361550/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0205150214.1675609670.361550/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0205150214.1675609670.361550/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230205150750362481-2132'
 createTime: '2023-02-05T15:07:51.533865Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-05_07_07_51-9648154825978406387'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0205150214'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-05T15:07:51.533865Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-05_07_07_51-9648154825978406387]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-05_07_07_51-9648154825978406387
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-05_07_07_51-9648154825978406387?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-05_07_07_51-9648154825978406387 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:07:58.196Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:07:59.792Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:07:59.819Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:07:59.873Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:07:59.929Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:07:59.963Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.008Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.073Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.110Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.152Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.180Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.206Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.236Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.262Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.294Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.332Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.364Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.388Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.415Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.451Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.479Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.586Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.641Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.667Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.696Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:00.722Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-05_07_07_51-9648154825978406387 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:01.789Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:01.816Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:01.852Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:14.238Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:08:45.074Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:09:16.081Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:09:28.134Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:51:06.473Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:51:31.683Z: JOB_MESSAGE_ERROR: generic::cancelled: Data channel closed, unable to receive additional data from SDK sdk-0-1
with MessageCode:
(b5470124f402c16a): SDK disconnect.
passed through:
==>
    dist_proc/dax/workflow/worker/fnapi_data_service.cc:371
generic::aborted: SDK harness sdk-0-0  disconnected.
with MessageCode:
(2f0fd5907d64dde5): SDK disconnect.
passed through:
==>
    dist_proc/dax/workflow/worker/fnapi_control_service.cc:217
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:52:03.278Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T15:54:04.211Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T16:27:06.636Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T16:32:11.616Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T16:51:12.342Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T17:00:13.257Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T17:22:18.537Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T17:27:19.104Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T17:46:20.579Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T18:01:24.789Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T18:12:27.091Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T18:35:28.926Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T18:40:30.945Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T19:06:33.434Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T19:16:34.951Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T19:38:36.894Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-05T19:52:38.568Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-05_07_07_51-9648154825978406387 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py">, line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py">, line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py">, line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-05_07_07_51-9648154825978406387?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 54m 31s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fif2lsx6f6ndy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #894

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/894/display/redirect?page=changes>

Changes:

[noreply] 24472 Implement FileWriteSchemaTransformProvider (#24806)

[noreply] Embed ML video to docs (#25302)

[noreply] skip automated expansion test (#25304)

[noreply] Add banner highlighting beam ml (#25306)

[noreply] Optimize to use cached output receiver instead of creating one on DoFn

[noreply] Optimize PGBK table to only update cache when there is a large enough

[noreply] Swap setting a context from being on the hot path when we emit elements


------------------------------------------
[...truncated 26.11 KB...]
  Using cached google_cloud_vision-3.3.1-py2.py3-none-any.whl (393 kB)
Collecting google-cloud-recommendations-ai<0.8.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.7.1-py2.py3-none-any.whl (148 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.46.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.64
  Using cached botocore-1.29.64-py3-none-any.whl (10.4 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.0-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.12.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2991127 sha256=1075044fc8a414cb53c8d1fc9f1f09c5842b3c988b71e8614c2b0a89ded60b45
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.64 botocore-1.29.64 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.0 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.0 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.0 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.66.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.5 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.0 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
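
The pre-building workflow referenced above can be enabled purely through
pipeline options. A minimal sketch, assuming the documented
--prebuild_sdk_container_engine and --docker_registry_push_url setup options;
the project, bucket, and registry path below are hypothetical:

    # Sketch: pre-build the SDK worker container once so later submissions
    # skip the per-job dependency installation (see the guide linked above).
    from apache_beam import Pipeline
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                # hypothetical project
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',  # hypothetical bucket
        '--requirements_file=requirements.txt',
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/my-project/beam-prebuilt',  # hypothetical
    ])
    with Pipeline(options=options) as pipeline:
        ...  # pipeline construction goes here
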
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0204150204.1675523268.290953/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0204150204.1675523268.290953/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0204150204.1675523268.290953/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0204150204.1675523268.290953/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230204150748291983-6714'
 createTime: '2023-02-04T15:07:49.519860Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-04_07_07_48-15200723087970190708'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0204150204'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-04T15:07:49.519860Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-04_07_07_48-15200723087970190708]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-04_07_07_48-15200723087970190708
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-04_07_07_48-15200723087970190708?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-04_07_07_48-15200723087970190708 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:07:55.320Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:01.817Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:06.842Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:11.969Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.047Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.077Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.136Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.205Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.251Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.292Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.319Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.341Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.379Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.412Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.434Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.466Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.500Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.524Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.544Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.566Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.599Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.707Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.761Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.792Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.824Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:12.858Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:13.942Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:13.972Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:14.006Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-04_07_07_48-15200723087970190708 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:33.847Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
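
The cleanup that message suggests can be scripted against the Cloud
Monitoring API. A rough sketch, assuming the google-cloud-monitoring client
library (not part of this build's dependency set) and suitable permissions:

    # Sketch: list custom metric descriptors and delete the stale ones so
    # new Dataflow user metrics can be created again.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    # Collect names first, then delete, to avoid mutating a paged listing.
    names = [d.name for d in client.list_metric_descriptors(request=request)]
    for name in names:
        client.delete_metric_descriptor(name=name)
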
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:08:58.562Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:09:29.158Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:09:41.126Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:52:15.463Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:54:16.674Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T15:55:17.403Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T16:23:18.974Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T16:26:22.220Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T16:47:23.955Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T16:53:26.360Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T17:10:28.629Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T17:19:37.839Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T17:43:31.058Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T17:56:33.455Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T18:18:27.767Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T18:32:38.657Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T18:44:43.467Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T19:08:44.643Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T19:19:46.767Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T19:41:39.465Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T19:55:51.398Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-04_07_07_48-15200723087970190708 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T20:01:17.667Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-04_07_07_48-15200723087970190708.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T20:01:17.692Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T20:01:17.740Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T20:01:17.767Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T20:01:17.789Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-04T20:01:17.811Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-04_07_07_48-15200723087970190708?project=<ProjectId>
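
The flow behind this failure: the streaming job never reached a terminal
state within the test's timeout, so the job was cancelled and the wait
raised an AssertionError. A sketch of that wait-then-cancel pattern
(illustrative only, not the actual load_test.py code):

    # Sketch: wait up to timeout_ms, cancel the job if it is still running,
    # and surface the non-terminal state as a test failure.
    from apache_beam.runners.runner import PipelineState

    def wait_or_cancel(result, timeout_ms):
        state = result.wait_until_finish(duration=timeout_ms)
        if state is None or not PipelineState.is_terminal(state):
            result.cancel()  # stop the streaming job so it does not run forever
            raise AssertionError(
                'Job did not reach a terminal state within %s ms' % timeout_ms)
        return state
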

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 17s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dxv6ggqpapdy6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #893

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/893/display/redirect?page=changes>

Changes:

[noreply] Add sideinputs to the RunInference Transform (#25200)

[noreply] Move changes to correct release in CHANGES.md (#25288)

[noreply] Pass instead of raising an error (#25287)

[noreply] [WebSite] Add new Python quickstart (#24804)

[noreply] [CdapIO] Implement windowed write (#25206)

[noreply] [Spark Dataset runner] Fix collection encoder bug that may lead to

[noreply] [Spark Dataset runner] Break linage of dataset to reduce Spark planning

[noreply] Fix flaky test due to create bigquery dataset conflict (#25266)


------------------------------------------
[...truncated 26.01 KB...]
Collecting azure-core>=1.7.0
  Using cached azure_core-1.26.3-py3-none-any.whl (174 kB)
Collecting azure-identity>=1.12.0
  Using cached azure_identity-1.12.0-py3-none-any.whl (135 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.46.0.dev0) (1.16.0)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting botocore<1.30.0,>=1.29.63
  Using cached botocore-1.29.63-py3-none-any.whl (10.4 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.0-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.12.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2991127 sha256=115ae507dab1becce4bc4fe3309a9beff0a154b5f24045e34f2f66eef9bde91a
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.3 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.63 botocore-1.29.63 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.0 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.0 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.0 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.66.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.5 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.0 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0203150202.1675436873.893052/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0203150202.1675436873.893052/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0203150202.1675436873.893052/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0203150202.1675436873.893052/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230203150753894101-3654'
 createTime: '2023-02-03T15:07:54.986469Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-03_07_07_54-3954429928506136292'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0203150202'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-03T15:07:54.986469Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-03_07_07_54-3954429928506136292]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-03_07_07_54-3954429928506136292
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-03_07_07_54-3954429928506136292?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-03_07_07_54-3954429928506136292 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:01.939Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.363Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.399Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.454Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.511Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.539Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.605Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.652Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.700Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.735Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.764Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.786Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.817Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.841Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.862Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.887Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.911Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.956Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:03.982Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:04.016Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:04.039Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:04.164Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:04.211Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:04.231Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:04.252Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:04.282Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-03_07_07_54-3954429928506136292 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:05.365Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:05.405Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:05.428Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:37.993Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:08:47.504Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:09:18.945Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:09:30.486Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:53:09.589Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:55:16.632Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T15:56:03.715Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T16:25:00.190Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T16:26:10.644Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T16:49:05.573Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T17:01:14.811Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T17:22:18.135Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T17:38:21.967Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T17:57:14.190Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T18:12:26.455Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T18:31:27.631Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T18:39:33.238Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T19:04:25.994Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T19:06:37.830Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T19:38:39.327Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T19:45:32.529Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-03_07_07_54-3954429928506136292 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T20:01:01.576Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-03_07_07_54-3954429928506136292.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T20:01:01.624Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T20:01:01.708Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T20:01:01.738Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T20:01:01.783Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-03T20:01:01.805Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-03_07_07_54-3954429928506136292?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 15s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/gcbxeh5tyjuo6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #892

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/892/display/redirect?page=changes>

Changes:

[noreply] Bump torch (#25057)

[noreply] Fix XVR_Direct time out (#25247)

[noreply] Exclude changestream integration test on DataflowLegacyWorker (#25239)

[noreply] [Tour of Beam] [Task] Fix dependency management for 2.44 Playground java

[noreply] Bump github.com/aws/aws-sdk-go-v2/feature/s3/manager in /sdks (#25270)

[noreply] [Spark Runner] Add new experiment that provides concurrent bounded


------------------------------------------
[...truncated 25.98 KB...]
Collecting google-cloud-vision<4,>=2
  Using cached google_cloud_vision-3.3.1-py2.py3-none-any.whl (393 kB)
Collecting google-cloud-recommendations-ai<0.8.0,>=0.1.0
  Using cached google_cloud_recommendations_ai-0.7.1-py2.py3-none-any.whl (148 kB)
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from azure-core>=1.7.0->apache-beam==2.46.0.dev0) (1.16.0)
Collecting msal<2.0.0,>=1.12.0
  Using cached msal-1.21.0-py2.py3-none-any.whl (89 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting botocore<1.30.0,>=1.29.62
  Using cached botocore-1.29.62-py3-none-any.whl (10.4 MB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.0-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.12.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2985660 sha256=55bcc88a71b8632762c59ad32a7e3830e95c045da15157049b514c930bed64ca
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.2 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.62 botocore-1.29.62 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.0 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.5.0 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.0 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.0 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.66.0 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.5 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.0 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
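For reference, the pre-building workflow mentioned in the message above is driven by pipeline options in apache_beam's SetupOptions. A minimal sketch, not taken from this job's configuration; the registry push URL is a hypothetical placeholder:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Sketch only: the push URL below is a placeholder, not this job's value.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
        # Build the SDK container once via Cloud Build so workers skip the
        # per-startup dependency installation flagged in the message above.
        '--prebuild_sdk_container_engine=cloud_build',
        '--docker_registry_push_url=gcr.io/<project>/prebuilt-sdk',
    ])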
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0202150208.1675350472.283635/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0202150208.1675350472.283635/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0202150208.1675350472.283635/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0202150208.1675350472.283635/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230202150752284601-7467'
 createTime: '2023-02-02T15:07:53.437189Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-02_07_07_52-10076104660800019792'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0202150208'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-02T15:07:53.437189Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-02_07_07_52-10076104660800019792]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-02_07_07_52-10076104660800019792
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-02_07_07_52-10076104660800019792?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-02_07_07_52-10076104660800019792 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:01.592Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.241Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.268Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.335Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.508Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.550Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.606Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.670Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.728Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.759Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.796Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.836Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.867Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.906Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.939Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:03.973Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:04.007Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:04.037Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:04.065Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:04.098Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:04.126Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:04.240Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:04.279Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:04.315Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:04.345Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:04.372Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:05.468Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:05.491Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:05.553Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-02_07_07_52-10076104660800019792 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:21.622Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
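The message above suggests pruning old custom metric descriptors. A minimal sketch with the google-cloud-monitoring client; the filter is an assumption about which descriptors are candidates, and deletion is destructive, so this is illustrative rather than a vetted cleanup policy:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    # Sketch only: review each descriptor before deleting; this filter is an
    # example, not a policy vetted for the apache-beam-testing project.
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        client.delete_metric_descriptor(request={'name': descriptor.name})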
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:08:49.081Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:09:18.826Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:09:31.096Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:49:00.506Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:53:17.275Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T15:54:09.062Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T16:21:12.134Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T16:23:22.117Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T16:51:14.149Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T16:53:05.872Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T17:22:18.085Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T17:33:20.215Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T17:55:23.258Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T18:09:34.774Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T18:30:28.112Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T18:45:19.224Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T19:06:31.609Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T19:20:33.661Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T19:40:35.397Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T19:55:37.043Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-02_07_07_52-10076104660800019792 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T20:00:53.154Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-02_07_07_52-10076104660800019792.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T20:00:53.184Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T20:00:53.241Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T20:00:53.272Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T20:00:53.298Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-02T20:00:53.320Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-02_07_07_52-10076104660800019792?project=<ProjectId>
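For context, the failure comes from PipelineResult.wait_until_finish: the Dataflow runner raises if the job is still non-terminal when the wait ends. A minimal sketch of the same call pattern on a trivial pipeline; the timeout value is illustrative, and the explicit cancel is one way a caller could clean up instead of failing (not what load_test.py does):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    timeout_ms = 4 * 60 * 60 * 1000  # illustrative budget, not the test's value

    pipeline = beam.Pipeline(options=PipelineOptions())
    _ = pipeline | beam.Create([1, 2, 3])  # stand-in for the load test graph

    result = pipeline.run()
    # With a duration, wait_until_finish stops blocking once the budget
    # expires; on Dataflow a non-terminal job at that point raises instead.
    state = result.wait_until_finish(duration=timeout_ms)
    if state is not None and not PipelineState.is_terminal(state):
        result.cancel()  # cancel the still-running job rather than asserting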

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 13s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/p6jqdho6mx6cq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #891

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/891/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Setup typedoc for doc generation.

[Robert Bradshaw] Quick pass adding some typescript docs and pointers.

[Robert Bradshaw] Add typescript doc gen to release process.

[Robert Bradshaw] Split README into user-facing and dev-facing portions.

[noreply] Attempt fix GCPIO_Direct tests timeout (#25209)

[noreply] Fix pulling licenses (#25234)

[Robert Bradshaw] Deterministic ordering of gbk outputs for testing.

[Robert Bradshaw] Increase timeouts for cross-language tests.

[noreply] Ignore flags for beam_sql magic (#25210)

[noreply] Stop publishing empty test-only artifacts (#25191)

[noreply] Fix Debezium expansion service fail to start (#25243)

[noreply] Merge pull request #25094: Externalizing the StreamWriter parameters for

[noreply] Update/add torch versions to tox.ini (#25045)

[noreply] Add support for templates in task hints (#25214)


------------------------------------------
[...truncated 27.01 KB...]
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.61
  Using cached botocore-1.29.61-py3-none-any.whl (10.4 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.0-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.12.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2985660 sha256=88293336a34e09627552c4ed004ec2ec7144cbbf320040ca93d7d751a30e1c66
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.2 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.61 botocore-1.29.61 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.0 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.4.2 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.0 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.0 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.65.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.21.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.5 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.0 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0201150225.1675264067.039698/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0201150225.1675264067.039698/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0201150225.1675264067.039698/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0201150225.1675264067.039698/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230201150747040822-6574'
 createTime: '2023-02-01T15:07:48.228505Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-02-01_07_07_47-13373062827315730904'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0201150225'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-02-01T15:07:48.228505Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-02-01_07_07_47-13373062827315730904]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-02-01_07_07_47-13373062827315730904
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-01_07_07_47-13373062827315730904?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-01_07_07_47-13373062827315730904 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:07:57.953Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:07:59.505Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:07:59.554Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:07:59.635Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:07:59.748Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:07:59.784Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:07:59.864Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:07:59.963Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.025Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.059Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.096Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.184Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.224Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.258Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.303Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.337Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.386Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.433Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.466Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.515Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.567Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.760Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.847Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.910Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.955Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:00.991Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:02.161Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:02.213Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:02.297Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-01_07_07_47-13373062827315730904 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:16.073Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:48.465Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:48.498Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:08:58.339Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:09:24.107Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:09:35.895Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:52:00.238Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:54:57.901Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T15:55:58.363Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T16:24:01.406Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T16:25:07.829Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T16:43:08.872Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T16:57:21.388Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T16:58:21.162Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T17:17:19.367Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T17:29:18.625Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T17:31:24.366Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T17:52:23.062Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T17:53:31.277Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T18:05:36.994Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T18:10:23.802Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T18:26:26.008Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T18:34:37.502Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T18:40:38.708Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T18:58:40.628Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T19:02:52.915Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T19:17:26.127Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T19:20:58.050Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T19:31:53.191Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T19:39:54.235Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T19:44:46.180Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T19:59:54.291Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-02-01_07_07_47-13373062827315730904 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T20:00:50.696Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-02-01_07_07_47-13373062827315730904.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T20:00:50.732Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T20:00:50.791Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T20:00:50.819Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T20:00:50.843Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-02-01T20:00:50.877Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-02-01_07_07_47-13373062827315730904?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wwztnnou5ogbi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #890

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/890/display/redirect?page=changes>

Changes:

[noreply] Fix Jdbc Write after window assigned (#25173)

[noreply] Update Github issues link for Go SDK (#25161)

[noreply] Fix typo. (#21864)

[noreply] Standardizing naming and URN for Pubsub Read Schema Transform (#25170)


------------------------------------------
[...truncated 26.72 KB...]
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting botocore<1.30.0,>=1.29.60
  Using cached botocore-1.29.60-py3-none-any.whl (10.4 MB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.0-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
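The three grpcio-status wheels above show pip's resolver backtracking: the newer releases declare pins that conflict with the rest of this dependency set, so pip steps back release by release until 1.48.2 resolves (the version the final install line below confirms). A hedged sketch of avoiding the retries by pinning the resolved version up front, via a hypothetical constraints file passed with pip's -c flag:

    # constraints.txt (hypothetical; 1.48.2 is the version the resolver settled on)
    grpcio-status==1.48.2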
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.12.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2985565 sha256=e5220245218ffd38cc12363d272fee7f3055026a5ef3688477aa4cce25d66d39
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.2 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.60 botocore-1.29.60 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.0 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.4.2 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.0 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.0 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.65.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.20.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.5 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.0 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
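The hint above points at the container pre-building workflow; a minimal sketch of the two pipeline options it refers to, assuming Cloud Build is enabled in the project and using a hypothetical Artifact Registry path:

    --prebuild_sdk_container_engine=cloud_build \
    --docker_registry_push_url=us-central1-docker.pkg.dev/apache-beam-testing/hypothetical-repo

With these options the extra dependencies are baked into a derived SDK image once at submission time instead of being pip-installed on every worker at startup.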
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0131150233.1675177672.381597/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0131150233.1675177672.381597/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0131150233.1675177672.381597/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0131150233.1675177672.381597/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230131150752382594-2062'
 createTime: '2023-01-31T15:07:53.620202Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-01-31_07_07_53-7103378790049848501'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0131150233'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-01-31T15:07:53.620202Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-01-31_07_07_53-7103378790049848501]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-01-31_07_07_53-7103378790049848501
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-01-31_07_07_53-7103378790049848501?project=apache-beam-testing
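The same state shown in the monitoring console can also be polled from a terminal; a one-line sketch using the job id and region printed above (assumes gcloud is authenticated against the apache-beam-testing project):

    gcloud dataflow jobs describe 2023-01-31_07_07_53-7103378790049848501 --region=us-central1 --format='value(currentState)'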
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-01-31_07_07_53-7103378790049848501 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:01.121Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:03.546Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:03.583Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:03.665Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:03.735Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:03.764Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:03.852Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:03.914Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:03.952Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.006Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.037Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.077Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.127Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.168Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.206Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.232Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.276Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.323Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.350Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.371Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.398Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.534Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.583Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.612Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.650Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:04.672Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:05.790Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:05.828Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:05.866Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-01-31_07_07_53-7103378790049848501 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:36.706Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
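The message above means the project has hit the 100-descriptor limit for Dataflow-created custom metrics, and points at the Monitoring API's list/delete methods for pruning old ones. A hedged sketch with the google-cloud-monitoring client (assumes the package is installed and the caller has permission to delete descriptors; the filter is illustrative):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    # List the custom metric descriptors mentioned in the warning, then delete them.
    request = {"name": "projects/apache-beam-testing",
               "filter": 'metric.type = starts_with("custom.googleapis.com/")'}
    for descriptor in client.list_metric_descriptors(request=request):
        client.delete_metric_descriptor(name=descriptor.name)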
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:08:48.927Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:09:19.668Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:09:31.678Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:50:10.632Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T15:54:06.895Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T16:01:05.971Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T16:24:09.063Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T16:25:10.572Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T16:48:14.285Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T16:57:15.824Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T17:00:18.006Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T17:12:20.140Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T17:22:15.128Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T17:32:25.230Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T17:40:17.043Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T17:47:28.268Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T17:58:29.877Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T18:09:35.236Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T18:16:34.551Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T18:23:29.726Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T18:34:41.831Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T18:45:44.033Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T18:52:36.639Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T18:59:44.436Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T19:10:46.377Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T19:11:52.286Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T19:26:52.149Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T19:37:52.885Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T19:44:56.081Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T19:52:57.888Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T20:00:46.335Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-01-31_07_07_53-7103378790049848501.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T20:00:46.368Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T20:00:46.436Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T20:00:46.454Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T20:00:46.483Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-31T20:00:46.502Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-01-31_07_07_53-7103378790049848501 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-01-31_07_07_53-7103378790049848501?project=<ProjectId>
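The assertion above comes from the load-test harness: wait_until_finish() returned without the streaming job ever reaching a terminal state, so the run was only stopped by the external cancel request logged earlier. A minimal sketch, with a hypothetical explicit ceiling, of bounding the wait and cancelling from the harness side instead of asserting (result is the PipelineResult returned by pipeline.run()):

    from apache_beam.runners.runner import PipelineState

    # wait_until_finish takes a duration in milliseconds and returns the
    # last observed state once that duration elapses.
    state = result.wait_until_finish(duration=4 * 60 * 60 * 1000)  # hypothetical 4h cap
    if state not in (PipelineState.DONE, PipelineState.FAILED, PipelineState.CANCELLED):
        result.cancel()  # stop the streaming job rather than leaving it running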

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
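For reference, the failing task maps to a local invocation along these lines from a Beam source checkout (hypothetical; the load test also needs the same pipeline arguments the Jenkins job supplies):

    ./gradlew :sdks:python:apache_beam:testing:load_tests:run --stacktrace --info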

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/2yyt57onyg5yq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_LoadTests_Python_Combine_Dataflow_Streaming #889

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/889/display/redirect>

Changes:


------------------------------------------
[...truncated 26.48 KB...]
  Using cached msal-1.20.0-py2.py3-none-any.whl (90 kB)
Collecting msal-extensions<2.0.0,>=0.3.0
  Using cached msal_extensions-1.0.0-py2.py3-none-any.whl (19 kB)
Collecting msrest>=0.7.1
  Using cached msrest-0.7.1-py3-none-any.whl (85 kB)
Collecting jmespath<2.0.0,>=0.7.1
  Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0
  Using cached s3transfer-0.6.0-py3-none-any.whl (79 kB)
Collecting botocore<1.30.0,>=1.29.59
  Using cached botocore-1.29.59-py3-none-any.whl (10.4 MB)
Collecting cffi>=1.12
  Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12
  Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.9-py3-none-any.whl (34 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=1.6.0->apache-beam==2.46.0.dev0) (23.0)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.4.1-py2.py3-none-any.whl (77 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5
  Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.51.1-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1
  Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0
  Using cached sqlparse-0.4.3-py3-none-any.whl (42 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0
  Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0
  Using cached exceptiongroup-1.1.0-py3-none-any.whl (14 kB)
Collecting pbr>=0.11
  Using cached pbr-5.11.1-py2.py3-none-any.whl (112 kB)
Collecting tomli>=1.0.0
  Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Collecting iniconfig
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (1.0.0)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (2.1.3)
Collecting pytest-forked
  Using cached pytest_forked-1.4.0-py3-none-any.whl (4.9 kB)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.14-py2.py3-none-any.whl (140 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (170 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting threadpoolctl>=2.0.0
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting scipy>=1.1.0
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting greenlet!=0.4.17
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.5.0-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2
  Using cached googleapis_common_protos-1.58.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Collecting grpcio-status>=1.33.2
  Using cached grpcio_status-1.50.0-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.49.1-py3-none-any.whl (14 kB)
  Using cached grpcio_status-1.48.2-py3-none-any.whl (14 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.46.0.dev0) (3.12.0)
Collecting PyJWT[crypto]<3,>=1.0.0
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: py in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest-forked->pytest-xdist<3,>=2.5.0->apache-beam==2.46.0.dev0) (1.11.0)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.46.0.dev0-py3-none-any.whl size=2985565 sha256=8f06de9c354306526516c9c60591cf694d26de74b16c800fe8a7960d0c4ee1fa
  Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyasn1, parameterized, docopt, crcmod, charset-normalizer, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, rsa, regex, pyyaml, python-dateutil, pyparsing, pymysql, pymongo, PyJWT, pyhamcrest, pycparser, pyasn1-modules, pyarrow, psycopg2-binary, proto-plus, portalocker, pbr, overrides, orjson, objsize, oauthlib, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dill, deprecation, cloudpickle, certifi, cachetools, attrs, sqlalchemy, scikit-learn, requests, pydot, pandas, mock, hypothesis, httplib2, grpcio-status, google-resumable-media, google-auth, freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, pytest, oauth2client, hdfs, grpc-google-iam-v1, google-auth-httplib2, google-api-core, docker, cryptography, azure-core, testcontainers, pytest-timeout, pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, pytest-xdist, msal, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, msal-extensions, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.46.0.dev0 attrs-22.2.0 azure-core-1.26.2 azure-identity-1.12.0 azure-storage-blob-12.14.1 boto3-1.26.59 botocore-1.29.59 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.0.1 cloudpickle-2.2.1 crcmod-1.7 cryptography-39.0.0 deprecation-2.1.0 dill-0.3.1.1 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.0 execnet-1.9.0 fastavro-1.7.1 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.16.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.4.2 google-cloud-bigquery-storage-2.16.2 google-cloud-bigtable-1.7.3 google-cloud-core-2.3.2 google-cloud-datastore-1.15.5 google-cloud-dlp-3.11.1 google-cloud-language-1.3.2 google-cloud-pubsub-2.14.0 google-cloud-pubsublite-1.6.0 google-cloud-recommendations-ai-0.7.1 google-cloud-spanner-3.27.0 google-cloud-videointelligence-1.16.3 google-cloud-vision-3.3.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.58.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.48.2 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.65.2 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-2.0.0 msal-1.20.0 msal-extensions-1.0.0 msrest-0.7.1 oauth2client-4.1.3 oauthlib-3.2.2 objsize-0.6.1 orjson-3.8.5 overrides-6.5.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.11.1 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.5 pyarrow-9.0.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.13.0 pymysql-1.0.2 pyparsing-3.0.9 pytest-7.2.1 pytest-forked-1.4.0 pytest-timeout-2.1.0 pytest-xdist-2.5.0 python-dateutil-2.8.2 pytz-2022.7.1 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests-oauthlib-1.3.1 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.46 sqlparse-0.4.3 tenacity-5.1.5 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.14 websocket-client-1.5.0 wrapt-1.14.1 zstandard-0.19.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.46.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230126" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0130150240.1675091271.911819/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0130150240.1675091271.911819/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0130150240.1675091271.911819/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0130150240.1675091271.911819/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230130150751912817-1118'
 createTime: '2023-01-30T15:07:53.125344Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-01-30_07_07_52-2757792807318412129'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-combine-1-0130150240'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-01-30T15:07:53.125344Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-01-30_07_07_52-2757792807318412129]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-01-30_07_07_52-2757792807318412129
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-01-30_07_07_52-2757792807318412129?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-01-30_07_07_52-2757792807318412129 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:07:59.986Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.339Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.370Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.423Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.482Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.513Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.577Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.644Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.686Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.724Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.745Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.765Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.809Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.835Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.867Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.903Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.925Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.948Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:01.972Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:02.004Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:02.036Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:02.123Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:02.164Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:02.195Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:02.228Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:02.262Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:03.324Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-01-30_07_07_52-2757792807318412129 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:03.364Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:03.401Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:14.658Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:08:48.629Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:09:15.983Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:09:27.977Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:52:09.861Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:53:05.439Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T15:54:06.974Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T16:26:09.532Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T16:29:11.042Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T16:50:16.370Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T17:03:18.165Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T17:07:19.305Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T17:25:19.490Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T17:35:46.226Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T17:40:18.477Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T17:57:39.756Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T18:11:32.145Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T18:15:48.243Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T18:33:39.972Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T18:47:45.351Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T18:48:46.416Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T19:10:47.892Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T19:20:49.577Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T19:21:51.774Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T19:45:46.543Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T19:54:57.796Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T19:55:59.264Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T20:00:45.135Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-01-30_07_07_52-2757792807318412129.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T20:00:45.176Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T20:00:45.269Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T20:00:45.300Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T20:00:45.337Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-01-30T20:00:45.363Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-01-30_07_07_52-2757792807318412129 is in state JOB_STATE_CANCELLING
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py",> line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",> line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 1547, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-01-30_07_07_52-2757792807318412129?project=<ProjectId>

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4h 55m 15s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zswhfykwuzpik

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org